Data Challenge : Challenge Large Scale Machine Learning

Hugo Michel, Février 2022

Data Challenge Description : Face Recognition

In recent years, face recognition systems have achieved extremely high levels of performance, opening the door to a wider range of applications where reliability levels were previously prohibitive to consider automation. This is mainly due to the adoption of deep learning techniques in computer vision. The most widely adopted paradigm is to train a $f: \mathcal{X} \rightarrow \mathbb{R}^d$ which, from a given image $im \in \mathcal{X}$, extracts a feature vector $z \in \mathbb{R}^d$ which synthesizes the relevant features of $im$.

The recognition phase then consists, from two images $im_1, im_2$, in predicting whether or not they correspond to the same identity. This is done from the extracted features $z_1, z_2$.

Goal

In this data challenge, the goal is to train a machine learning model that, given a vector $[z_1, z_2]$ consisting of the concatenation of two patterns $z_1$ and $z_2$, predicts whether or not these two images match the same identity.

Training Data

The train set consists of two files train_data.npy and train_labels.txt

The train_data.npy file contains one observation per line, which consists of the concatenation of two templates, each of dimension 48

The file train_labels.npy contains two classes labeled per line that indicate whether the image pair matches the same identity:

  • 1 => image pairs belonging to the same identity
  • 0 => image pairs not belonging to the same identity

Performance

For the evaluation of the performance of the models, the idea is to minimize the sum of the rate of false positives rate FPR and the rate of false negatives rate FNR. The performance score of the model is calculated using the following equation.

$score = 1 - (FPR + FNR)$

-------- Import libs --------

In [1]:
### Install requirements ###
#!pip install featurewiz
#!pip install scikit_optimize
#!pip3 install catboost

### Data transformation libs ###
import numpy as np
import pandas as pd

### Viz libs ###
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.colors as mcolors
import seaborn as sns
from statsmodels.graphics.tsaplots import plot_acf,plot_pacf

### Features selection libs ###
from sklearn.decomposition import PCA
from sklearn.feature_selection import RFE
from sklearn.linear_model import ElasticNet
from itertools import product
from featurewiz import featurewiz

### Models selection libs ###
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import RepeatedStratifiedKFold
from sklearn.model_selection import GridSearchCV
from skopt import BayesSearchCV

### Metrics Evaluation libs ###
from sklearn.metrics import accuracy_score,f1_score,roc_auc_score,confusion_matrix,roc_curve
from sklearn.metrics import make_scorer

### ML libs ###
from sklearn.model_selection import GridSearchCV,RandomizedSearchCV,train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
import xgboost as xgb
from xgboost import XGBClassifier
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier
from sklearn.ensemble import VotingClassifier

### Deep Learning libs ###
import tensorflow as tf
from tensorflow import keras

### options ###
np.random.seed(seed=42)
pd.set_option('max_columns', 100)
pd.set_option('max_rows', 100)
Imported featurewiz. Version = 0.1.04. nrows=None uses all rows. Set nrows=1000 to randomly sample 1000 rows.
outputs = featurewiz(dataname, target, corr_limit=0.70, verbose=2, sep=',', 
		header=0, test_data='',feature_engg='', category_encoders='',
		dask_xgboost_flag=False, nrows=None)
Create new features via 'feature_engg' flag : ['interactions','groupby','target']
                                

-------- Import data --------

In [2]:
def extract_labels(txt_file):
    """
    Extraction des labels from a text file
    ---- PARAMETERS ----
    Input : text file (.txt)
    Return : labels as numpy format
    """
    with open(txt_file) as file:
        lines = file.readlines()
    y = []
    for elem in lines:
        label = int(elem[0])
        y.append(label)
    y = np.array(y)
    return y

Read train_data and train_labels

In [3]:
X, y = np.load("train_data.npy"), extract_labels("train_labels.txt")

Convert train_data to dataframe

In [4]:
X_copied = X.copy()
X_dataframe = pd.DataFrame(X_copied)

Convert train_labels to dataframe

In [5]:
y_copied = y.copy()
y_dataframe = pd.DataFrame(y_copied)

-------- Exploratory Data Analysis --------

Data Analysis for train_data

Visualisation of train_data

In [10]:
X_dataframe.head(10)
Out[10]:
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95
0 236.031 1.000 0.0 0.000 -2.134 -3.195 1.100 23.831 1.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 193.0 255.0 160.0 0.0 217.0 79.0 175.0 194.0 221.0 254.0 255.0 254.0 245.0 7.806 4.148 -0.411 2.673 1.451 84.0 0.945000 0.986000 0.621000 0.993000 0.686000 0.375000 0.998000 0.801000 0.0 3.021000 0.0 0.0 238.253 1.000 0.000 0.000 26.322 -9.786 -5.224 26.601 1.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 173.0 255.0 255.0 0.0 110.0 0.0 175.0 153.0 136.0 247.0 255.0 254.0 175.0 1.059 1.926 26.732 3.698 -1.463 78.0 0.929000 0.988000 0.643000 0.912000 0.995000 0.335000 0.995000 0.829000 0.0 2.503000 0.0 0.0
1 228.040 1.000 0.0 0.000 -0.860 -15.950 -1.410 58.040 0.0 1.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 212.0 255.0 255.0 213.0 96.0 172.0 188.0 226.0 254.0 196.0 225.0 210.0 2.570 1.130 -2.610 -4.070 -2.220 121.0 0.823407 0.993560 0.918158 0.858533 0.932080 0.649915 0.902721 0.869304 0.0 2.331588 1.0 0.0 107.160 1.000 0.000 0.000 3.690 9.150 0.840 59.800 0.0 0.880 1.0 1.0 0.0 0.0 0.670 255.0 170.0 181.0 229.0 255.0 255.0 141.0 22.0 175.0 193.0 212.0 220.0 255.0 254.0 158.0 -7.620 1.130 5.470 23.950 4.720 96.0 0.985727 0.996261 0.637621 0.847861 0.927677 0.403857 0.978128 0.826418 0.0 2.245238 1.0 0.0
2 158.310 1.000 0.0 0.000 -2.290 -7.680 0.120 22.770 1.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 201.0 255.0 248.0 0.0 255.0 220.0 236.0 175.0 194.0 202.0 254.0 255.0 254.0 247.0 5.660 1.540 -1.620 -0.060 -0.750 63.0 0.616242 0.985232 0.961017 0.995382 0.696401 0.768143 0.993044 0.860056 0.0 2.580157 0.0 0.0 214.620 1.000 0.000 0.000 1.240 -17.120 0.620 23.650 1.0 0.910 1.0 1.0 0.0 0.0 0.720 255.0 90.0 3.0 219.0 0.0 255.0 220.0 255.0 175.0 194.0 209.0 254.0 255.0 225.0 242.0 6.950 1.850 2.830 -4.280 2.890 60.0 0.718895 0.999113 0.611766 0.617831 0.864550 0.388376 0.922516 0.731915 0.0 3.006712 1.0 0.0
3 165.464 0.284 0.0 0.716 38.303 -16.267 -9.469 45.229 1.0 0.003 1.0 1.0 0.0 0.0 0.001 255.0 150.0 93.0 169.0 32.0 0.0 110.0 0.0 175.0 171.0 130.0 11.0 221.0 225.0 137.0 -10.177 2.300 57.818 1.508 1.880 56.0 0.897000 0.992000 0.500000 0.826000 0.993000 0.853000 0.990000 0.865000 0.0 -0.883000 0.0 0.0 173.099 1.000 0.000 0.000 -35.935 -6.119 0.461 30.383 1.0 0.003 1.0 0.0 0.0 1.0 0.001 255.0 205.0 147.0 237.0 177.0 0.0 110.0 0.0 175.0 178.0 133.0 0.0 255.0 225.0 170.0 -7.922 4.290 -35.607 4.839 -5.786 68.0 0.884000 0.993000 0.836000 0.834000 0.939000 0.892000 0.991000 0.911000 0.0 -2.930000 0.0 0.0
4 153.727 1.000 0.0 0.000 0.780 -5.169 -0.291 27.574 0.0 0.001 1.0 1.0 0.0 0.0 0.001 255.0 255.0 218.0 247.0 255.0 0.0 154.0 72.0 175.0 140.0 199.0 171.0 255.0 254.0 145.0 3.239 2.047 0.166 1.760 1.430 88.0 0.545000 0.628000 0.876000 0.999000 0.639000 0.440000 1.002000 0.733000 0.0 0.575000 0.0 0.0 201.940 0.862 0.138 0.000 -5.379 5.491 -0.406 18.810 0.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 234.0 216.0 255.0 0.0 160.0 162.0 175.0 194.0 200.0 114.0 255.0 254.0 164.0 -4.820 1.744 -3.066 -2.805 -1.338 88.0 0.914000 0.936000 0.968000 0.963000 0.657000 0.483000 0.987000 0.845000 0.0 -1.016000 0.0 0.0
5 198.550 1.000 0.0 0.000 0.180 -12.890 0.850 18.080 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 224.0 255.0 255.0 216.0 0.0 175.0 194.0 222.0 254.0 255.0 254.0 247.0 10.940 1.140 0.610 6.330 2.310 115.0 0.885903 0.988705 0.903482 0.805560 0.985948 0.597785 0.990743 0.880715 0.0 2.743334 0.0 0.0 199.580 1.000 0.000 0.000 2.070 -17.670 0.360 21.540 0.0 0.020 1.0 1.0 0.0 0.0 0.010 255.0 83.0 0.0 225.0 0.0 255.0 179.0 235.0 175.0 194.0 214.0 1.0 255.0 254.0 235.0 6.080 2.150 1.320 -13.540 2.140 51.0 0.819433 0.640720 0.916525 0.996785 0.473596 0.421382 0.897145 0.738814 0.0 -2.751095 0.0 0.0
6 304.509 1.000 0.0 0.000 3.026 -2.632 0.003 35.525 0.0 0.001 1.0 1.0 0.0 0.0 0.001 255.0 255.0 255.0 255.0 255.0 0.0 187.0 0.0 175.0 194.0 222.0 0.0 255.0 254.0 243.0 5.499 2.014 3.763 -4.611 0.782 94.0 0.930000 0.976000 0.794000 0.986000 0.953000 0.439000 0.990000 0.867000 0.0 -3.368000 0.0 0.0 327.375 1.000 0.000 0.000 -1.380 2.758 -0.065 42.020 0.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 255.0 255.0 0.0 220.0 0.0 175.0 194.0 222.0 0.0 255.0 254.0 247.0 4.292 2.251 -2.046 2.627 0.436 95.0 0.962000 0.943000 0.792000 0.988000 0.914000 0.153000 0.991000 0.821000 0.0 -4.298000 0.0 0.0
7 122.628 1.000 0.0 0.000 10.941 -16.446 -1.253 57.604 1.0 0.996 1.0 0.0 1.0 0.0 0.991 255.0 255.0 255.0 218.0 238.0 0.0 124.0 0.0 175.0 0.0 228.0 246.0 255.0 255.0 230.0 -0.166 1.945 7.398 5.916 0.739 88.0 0.847000 0.981000 0.417000 0.972000 0.998000 0.741000 0.992000 0.850000 0.0 1.189000 0.0 1.0 133.086 0.992 0.000 0.008 17.978 -1.784 -1.280 36.681 1.0 0.001 1.0 1.0 0.0 0.0 0.001 255.0 255.0 187.0 234.0 120.0 0.0 110.0 0.0 175.0 194.0 200.0 0.0 255.0 254.0 219.0 6.454 1.948 20.865 5.630 -0.196 88.0 0.818000 0.999000 0.359000 0.833000 0.994000 0.871000 0.998000 0.839000 0.0 -2.916000 0.0 0.0
8 126.430 1.000 0.0 0.000 -0.230 1.370 -0.260 20.150 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 239.0 235.0 255.0 219.0 0.0 175.0 194.0 214.0 253.0 255.0 254.0 225.0 11.190 1.340 -0.500 2.580 -2.720 94.0 0.966919 0.994114 0.914191 0.949576 0.843325 0.565733 0.986952 0.889393 0.0 3.055488 0.0 0.0 167.670 1.000 0.000 0.000 -8.400 -8.510 1.000 31.340 0.0 0.000 1.0 1.0 0.0 0.0 0.000 149.0 83.0 0.0 216.0 0.0 255.0 158.0 226.0 175.0 194.0 208.0 244.0 255.0 254.0 235.0 10.110 2.160 0.530 -3.280 -0.890 52.0 0.861544 0.989500 0.862687 0.979145 0.910781 0.319397 0.991307 0.845645 0.0 2.370744 0.0 0.0
9 191.150 1.000 0.0 0.000 1.670 -3.510 -0.280 30.840 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 219.0 255.0 255.0 220.0 115.0 175.0 194.0 226.0 254.0 255.0 254.0 249.0 6.200 1.060 1.670 -3.400 -0.220 126.0 0.930284 0.999320 0.897267 0.856447 0.919548 0.398221 0.999174 0.857651 0.0 2.769644 0.0 0.0 105.090 1.000 0.000 0.000 2.670 14.560 -2.640 26.940 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 160.0 166.0 207.0 127.0 0.0 112.0 7.0 175.0 194.0 202.0 254.0 255.0 254.0 219.0 5.650 1.120 5.440 21.670 -0.420 85.0 0.940918 0.989140 0.882326 0.744098 0.925329 0.358148 0.986829 0.833434 0.0 1.727199 0.0 0.0

By visualizing the dataframe we realize that most of the features seem to be quantitative while some features seem to be binary such as col_8, col_94 or col_95.

Moreover some features seem to be between 0 and 1.

However, these hypotheses must be verified by a data mining analysis.

Rename columns

In [11]:
columns_list = list()
for i in range(X_dataframe.shape[1]):
    name_col = "col_"+str(i)
    columns_list.append(name_col)

X_dataframe.columns = columns_list
    
X_dataframe
Out[11]:
col_0 col_1 col_2 col_3 col_4 col_5 col_6 col_7 col_8 col_9 col_10 col_11 col_12 col_13 col_14 col_15 col_16 col_17 col_18 col_19 col_20 col_21 col_22 col_23 col_24 col_25 col_26 col_27 col_28 col_29 col_30 col_31 col_32 col_33 col_34 col_35 col_36 col_37 col_38 col_39 col_40 col_41 col_42 col_43 col_44 col_45 col_46 col_47 col_48 col_49 col_50 col_51 col_52 col_53 col_54 col_55 col_56 col_57 col_58 col_59 col_60 col_61 col_62 col_63 col_64 col_65 col_66 col_67 col_68 col_69 col_70 col_71 col_72 col_73 col_74 col_75 col_76 col_77 col_78 col_79 col_80 col_81 col_82 col_83 col_84 col_85 col_86 col_87 col_88 col_89 col_90 col_91 col_92 col_93 col_94 col_95
0 236.031 1.000 0.000 0.000 -2.134 -3.195 1.100 23.831 1.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 193.0 255.0 160.0 0.0 217.0 79.0 175.0 194.0 221.0 254.0 255.0 254.0 245.0 7.806 4.148 -0.411 2.673 1.451 84.0 0.945000 0.986000 0.621000 0.993000 0.686000 0.375000 0.998000 0.801000 0.0 3.021000 0.0 0.0 238.253 1.000 0.000 0.000 26.322 -9.786 -5.224 26.601 1.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 173.0 255.0 255.0 0.0 110.0 0.0 175.0 153.0 136.0 247.0 255.0 254.0 175.0 1.059 1.926 26.732 3.698 -1.463 78.0 0.929000 0.988000 0.643000 0.912000 0.995000 0.335000 0.995000 0.829000 0.0 2.503000 0.0 0.0
1 228.040 1.000 0.000 0.000 -0.860 -15.950 -1.410 58.040 0.0 1.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 212.0 255.0 255.0 213.0 96.0 172.0 188.0 226.0 254.0 196.0 225.0 210.0 2.570 1.130 -2.610 -4.070 -2.220 121.0 0.823407 0.993560 0.918158 0.858533 0.932080 0.649915 0.902721 0.869304 0.0 2.331588 1.0 0.0 107.160 1.000 0.000 0.000 3.690 9.150 0.840 59.800 0.0 0.880 1.0 1.0 0.0 0.0 0.670 255.0 170.0 181.0 229.0 255.0 255.0 141.0 22.0 175.0 193.0 212.0 220.0 255.0 254.0 158.0 -7.620 1.130 5.470 23.950 4.720 96.0 0.985727 0.996261 0.637621 0.847861 0.927677 0.403857 0.978128 0.826418 0.0 2.245238 1.0 0.0
2 158.310 1.000 0.000 0.000 -2.290 -7.680 0.120 22.770 1.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 201.0 255.0 248.0 0.0 255.0 220.0 236.0 175.0 194.0 202.0 254.0 255.0 254.0 247.0 5.660 1.540 -1.620 -0.060 -0.750 63.0 0.616242 0.985232 0.961017 0.995382 0.696401 0.768143 0.993044 0.860056 0.0 2.580157 0.0 0.0 214.620 1.000 0.000 0.000 1.240 -17.120 0.620 23.650 1.0 0.910 1.0 1.0 0.0 0.0 0.720 255.0 90.0 3.0 219.0 0.0 255.0 220.0 255.0 175.0 194.0 209.0 254.0 255.0 225.0 242.0 6.950 1.850 2.830 -4.280 2.890 60.0 0.718895 0.999113 0.611766 0.617831 0.864550 0.388376 0.922516 0.731915 0.0 3.006712 1.0 0.0
3 165.464 0.284 0.000 0.716 38.303 -16.267 -9.469 45.229 1.0 0.003 1.0 1.0 0.0 0.0 0.001 255.0 150.0 93.0 169.0 32.0 0.0 110.0 0.0 175.0 171.0 130.0 11.0 221.0 225.0 137.0 -10.177 2.300 57.818 1.508 1.880 56.0 0.897000 0.992000 0.500000 0.826000 0.993000 0.853000 0.990000 0.865000 0.0 -0.883000 0.0 0.0 173.099 1.000 0.000 0.000 -35.935 -6.119 0.461 30.383 1.0 0.003 1.0 0.0 0.0 1.0 0.001 255.0 205.0 147.0 237.0 177.0 0.0 110.0 0.0 175.0 178.0 133.0 0.0 255.0 225.0 170.0 -7.922 4.290 -35.607 4.839 -5.786 68.0 0.884000 0.993000 0.836000 0.834000 0.939000 0.892000 0.991000 0.911000 0.0 -2.930000 0.0 0.0
4 153.727 1.000 0.000 0.000 0.780 -5.169 -0.291 27.574 0.0 0.001 1.0 1.0 0.0 0.0 0.001 255.0 255.0 218.0 247.0 255.0 0.0 154.0 72.0 175.0 140.0 199.0 171.0 255.0 254.0 145.0 3.239 2.047 0.166 1.760 1.430 88.0 0.545000 0.628000 0.876000 0.999000 0.639000 0.440000 1.002000 0.733000 0.0 0.575000 0.0 0.0 201.940 0.862 0.138 0.000 -5.379 5.491 -0.406 18.810 0.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 234.0 216.0 255.0 0.0 160.0 162.0 175.0 194.0 200.0 114.0 255.0 254.0 164.0 -4.820 1.744 -3.066 -2.805 -1.338 88.0 0.914000 0.936000 0.968000 0.963000 0.657000 0.483000 0.987000 0.845000 0.0 -1.016000 0.0 0.0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
297227 77.865 0.989 0.006 0.004 27.117 -25.148 -19.818 26.619 0.0 0.002 1.0 1.0 0.0 0.0 0.001 255.0 255.0 195.0 255.0 171.0 0.0 118.0 0.0 175.0 194.0 220.0 162.0 255.0 254.0 0.0 -1.256 2.123 26.403 -30.177 -11.785 80.0 0.892000 0.974000 -0.038000 0.954000 0.934000 0.333000 1.005000 0.722000 0.0 0.216000 0.0 0.0 97.226 0.286 0.714 0.000 -38.170 -10.770 6.619 27.552 0.0 0.002 1.0 1.0 0.0 0.0 0.001 255.0 255.0 255.0 255.0 255.0 0.0 110.0 102.0 166.0 175.0 130.0 2.0 255.0 254.0 139.0 0.064 1.817 -34.317 -3.931 2.428 67.0 0.757000 0.822000 0.927000 0.993000 0.060000 0.292000 0.980000 0.691000 0.0 -1.718000 0.0 0.0
297228 147.250 1.000 0.000 0.000 3.040 -9.620 -2.790 36.990 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 206.0 255.0 255.0 189.0 168.0 175.0 194.0 227.0 254.0 255.0 254.0 242.0 -0.810 1.100 5.540 -5.670 -2.530 117.0 0.932584 0.989263 0.866788 0.730796 1.006490 0.470933 0.993582 0.856815 0.0 2.642143 0.0 0.0 153.420 1.000 0.000 0.000 7.420 -11.340 -8.480 54.030 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 161.0 168.0 184.0 255.0 0.0 124.0 239.0 137.0 194.0 228.0 254.0 255.0 254.0 209.0 -12.290 1.140 11.420 -4.910 -3.900 98.0 0.900837 0.646228 0.796856 0.987863 0.457861 0.305589 0.986106 0.726660 0.0 2.328330 0.0 0.0
297229 219.495 1.000 0.000 0.000 -31.372 -2.752 -2.153 33.358 1.0 0.001 1.0 0.0 0.0 1.0 0.001 255.0 255.0 255.0 255.0 255.0 0.0 110.0 232.0 175.0 177.0 130.0 165.0 255.0 254.0 153.0 -8.352 1.735 -34.138 2.945 -2.204 74.0 0.852000 0.586000 0.693000 1.003000 0.248000 0.287000 0.997000 0.667000 0.0 1.944000 0.0 0.0 193.696 0.989 0.011 0.000 8.224 2.231 -1.572 32.566 1.0 0.001 1.0 0.0 0.0 1.0 0.001 255.0 255.0 196.0 255.0 139.0 0.0 213.0 222.0 175.0 188.0 208.0 254.0 255.0 254.0 240.0 5.959 1.335 6.166 0.892 0.854 89.0 0.973000 0.965000 0.407000 0.981000 0.998000 0.848000 0.990000 0.880000 0.0 1.779000 0.0 0.0
297230 111.054 1.000 0.000 0.000 -13.329 -5.626 3.282 34.638 0.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 179.0 253.0 103.0 0.0 112.0 0.0 175.0 194.0 152.0 253.0 255.0 254.0 210.0 -7.390 1.947 -14.144 -1.181 0.357 87.0 0.910000 0.987000 0.702000 0.953000 0.850000 0.324000 0.986000 0.817000 0.0 2.496000 0.0 0.0 63.577 0.995 0.000 0.005 37.341 -5.102 0.309 32.238 0.0 0.002 1.0 1.0 0.0 0.0 0.001 255.0 181.0 128.0 238.0 121.0 0.0 114.0 0.0 81.0 38.0 140.0 44.0 255.0 254.0 178.0 -13.085 4.413 40.631 16.873 9.637 72.0 0.817000 0.978000 0.742000 0.936000 0.465000 0.392000 0.985000 0.760000 0.0 -2.154000 0.0 0.0
297231 221.947 1.000 0.000 0.000 -34.777 -15.352 8.289 27.163 1.0 0.001 1.0 0.0 0.0 1.0 0.001 255.0 255.0 173.0 145.0 255.0 0.0 124.0 0.0 175.0 186.0 133.0 217.0 255.0 254.0 0.0 -6.255 2.125 -35.505 -2.704 2.327 78.0 0.613000 0.991000 0.863000 0.948000 0.892000 0.412000 0.996000 0.817000 0.0 -1.524000 0.0 0.0 250.406 1.000 0.000 0.000 -4.929 -14.156 2.115 23.549 1.0 0.002 1.0 0.0 0.0 1.0 0.001 255.0 255.0 200.0 249.0 255.0 0.0 169.0 3.0 175.0 193.0 226.0 77.0 255.0 254.0 215.0 7.762 2.946 -6.912 -4.915 2.126 90.0 0.820000 0.985000 0.699000 0.923000 0.984000 0.604000 0.992000 0.858000 0.0 0.250000 0.0 0.0

297232 rows × 96 columns

train_dataframe description

In [12]:
print("#### X_dataframe Description #### \n")

print("DIMENSION")
print("- " + str(X_dataframe.shape[1]) + ' features\n' + "- " + str(X_dataframe.shape[0]) + ' observations\n')

print("COLUMNS TYPES")
print(X_dataframe.dtypes)

print("\nMISSING VALUES")
print('- ' + str(X_dataframe.isna().sum().sum()) + ' missing values\n')

print("BINARY VARIABLES")
for i in range(X_dataframe.shape[1]):
    X_unique = X_dataframe["col_"+str(i)].unique().tolist()
    if len(X_unique) == 2:
        print("- col_" + str(i) + " is a binary variables : ",  X_unique)
#### X_dataframe Description #### 

DIMENSION
- 96 features
- 297232 observations

COLUMNS TYPES
col_0     float64
col_1     float64
col_2     float64
col_3     float64
col_4     float64
col_5     float64
col_6     float64
col_7     float64
col_8     float64
col_9     float64
col_10    float64
col_11    float64
col_12    float64
col_13    float64
col_14    float64
col_15    float64
col_16    float64
col_17    float64
col_18    float64
col_19    float64
col_20    float64
col_21    float64
col_22    float64
col_23    float64
col_24    float64
col_25    float64
col_26    float64
col_27    float64
col_28    float64
col_29    float64
col_30    float64
col_31    float64
col_32    float64
col_33    float64
col_34    float64
col_35    float64
col_36    float64
col_37    float64
col_38    float64
col_39    float64
col_40    float64
col_41    float64
col_42    float64
col_43    float64
col_44    float64
col_45    float64
col_46    float64
col_47    float64
col_48    float64
col_49    float64
col_50    float64
col_51    float64
col_52    float64
col_53    float64
col_54    float64
col_55    float64
col_56    float64
col_57    float64
col_58    float64
col_59    float64
col_60    float64
col_61    float64
col_62    float64
col_63    float64
col_64    float64
col_65    float64
col_66    float64
col_67    float64
col_68    float64
col_69    float64
col_70    float64
col_71    float64
col_72    float64
col_73    float64
col_74    float64
col_75    float64
col_76    float64
col_77    float64
col_78    float64
col_79    float64
col_80    float64
col_81    float64
col_82    float64
col_83    float64
col_84    float64
col_85    float64
col_86    float64
col_87    float64
col_88    float64
col_89    float64
col_90    float64
col_91    float64
col_92    float64
col_93    float64
col_94    float64
col_95    float64
dtype: object

MISSING VALUES
- 0 missing values

BINARY VARIABLES
- col_20 is a binary variables :  [0.0, 255.0]
- col_68 is a binary variables :  [0.0, 255.0]

We notice that all the columns are of type float.

The dataset is composed of 96 features for a total of 297 232 observations.

We notice that the number of features is very important close to (100 features). We can easily suppose that all these features will not be relevant for the training part of a machine learning model. Indeed, since machine learning has no understanding of causality, the models try to map any feature included in their dataset to the target variable, even if there is no causal relationship. This can lead to inaccurate and erroneous models. In fact having too many features can confuse certain machine learning algorithms such as clustering algorithms. To do so, it will be necessary to apply dimensionality reduction tools to reduce the cost of training the model and to solve complex problems with simple models.

Moreover the dataset does not contain any missing values (NaN values)

Finally, we notice that the 20th column and the 68th column of the dataset are binary variables with the value {0.0, 255.0}.

Re-ecoding of binary variable as 1 or 0

  • 0 if x = 0.0
  • 1 if x = 255.0
In [13]:
X_dataframe.loc[X_dataframe['col_20'] == 0.0, 'col_20'] = 0
X_dataframe.loc[X_dataframe['col_20'] == 255.0, 'col_20'] = 1
X_dataframe['col_20'] = X_dataframe['col_20'].astype(int)

X_dataframe.loc[X_dataframe['col_68'] == 0.0, 'col_68'] = 0
X_dataframe.loc[X_dataframe['col_68'] == 255.0, 'col_68'] = 1
X_dataframe['col_68'] = X_dataframe['col_68'].astype(int)

Analysis of the two templates

Visualization of the first and last columns of two templates

In [14]:
X_data = pd.DataFrame(X_dataframe[["col_0", "col_48","col_47", "col_95"]])
X_data
Out[14]:
col_0 col_48 col_47 col_95
0 236.031 238.253 0.0 0.0
1 228.040 107.160 0.0 0.0
2 158.310 214.620 0.0 0.0
3 165.464 173.099 0.0 0.0
4 153.727 201.940 0.0 0.0
... ... ... ... ...
297227 77.865 97.226 0.0 0.0
297228 147.250 153.420 0.0 0.0
297229 219.495 193.696 0.0 0.0
297230 111.054 63.577 0.0 0.0
297231 221.947 250.406 0.0 0.0

297232 rows × 4 columns

As indicated in the statement of the data challenge, the dataframe is constituted by the concatenation of templates each of which has a dimension of 48.

By visualizing the first and the last column of each template we notice that the order of magnitude between the 2 templates seem to be verified.

The 1st template is a concatenation of the columns col_0 to col_47.

The 2nd template corresponds to the concatenation of the columns col_48 to col_95.

So the dataframe is compliant to the data challenge description

Split the dataframe as two template

In [15]:
X_template_1 = X_dataframe.iloc[:,0:48]
X_template_2 = X_dataframe.iloc[:,48:96]

Description of template_1

In [16]:
print("--------------- TEMPLATE 1 DESCRIPTION ---------------\n")
round(X_template_1.describe(),2)
--------------- TEMPLATE 1 DESCRIPTION ---------------

Out[16]:
col_0 col_1 col_2 col_3 col_4 col_5 col_6 col_7 col_8 col_9 col_10 col_11 col_12 col_13 col_14 col_15 col_16 col_17 col_18 col_19 col_20 col_21 col_22 col_23 col_24 col_25 col_26 col_27 col_28 col_29 col_30 col_31 col_32 col_33 col_34 col_35 col_36 col_37 col_38 col_39 col_40 col_41 col_42 col_43 col_44 col_45 col_46 col_47
count 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.0 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00
mean 170.77 0.99 0.00 0.00 -0.18 -4.89 -0.14 35.22 0.48 0.06 1.00 0.84 0.04 0.13 0.02 252.82 250.83 225.47 236.86 224.14 0.5 180.60 94.56 171.47 174.52 199.69 185.53 253.72 250.93 208.66 2.20 1.69 -0.03 2.53 0.25 100.07 0.85 0.93 0.79 0.89 0.85 0.58 0.98 0.84 0.00 1.18 0.04 0.01
std 50.11 0.06 0.04 0.04 10.67 8.36 2.88 14.85 0.50 0.23 0.01 0.36 0.18 0.32 0.12 15.01 17.99 38.04 22.12 62.20 0.5 43.84 89.23 16.20 48.59 29.80 106.03 9.32 17.81 49.37 6.13 0.79 12.37 7.72 3.14 19.18 0.11 0.15 0.15 0.13 0.19 0.21 0.04 0.06 0.01 2.54 0.21 0.10
min -244.76 0.00 0.00 0.00 -58.52 -49.55 -39.96 4.96 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.0 110.00 0.00 51.00 0.00 0.00 0.00 0.00 0.00 0.00 -22.39 1.01 -84.40 -83.92 -51.95 24.00 -0.17 -0.07 -0.11 -0.17 -0.12 -0.02 -0.15 0.45 0.00 -6.73 0.00 0.00
25% 137.73 1.00 0.00 0.00 -2.26 -11.27 -1.41 23.65 0.00 0.00 1.00 1.00 0.00 0.00 0.00 255.00 255.00 192.00 226.00 227.00 0.0 134.00 0.00 175.00 189.00 185.00 83.00 255.00 254.00 189.00 -1.73 1.09 -2.98 -2.14 -1.13 85.00 0.80 0.98 0.71 0.85 0.82 0.40 0.99 0.82 0.00 -0.60 0.00 0.00
50% 173.17 1.00 0.00 0.00 -0.01 -3.53 -0.18 31.82 0.00 0.00 1.00 1.00 0.00 0.00 0.00 255.00 255.00 255.00 245.00 255.00 0.0 206.00 84.00 175.00 194.00 212.00 254.00 255.00 254.00 226.00 2.98 1.40 0.23 1.63 0.26 94.00 0.87 0.99 0.84 0.94 0.92 0.53 0.99 0.85 0.00 2.42 0.00 0.00
75% 205.99 1.00 0.00 0.00 2.11 1.38 1.04 46.38 1.00 0.00 1.00 1.00 0.00 0.00 0.00 255.00 255.00 255.00 255.00 255.00 1.0 220.00 173.00 175.00 194.00 222.00 254.00 255.00 254.00 242.00 6.73 2.04 3.32 6.28 1.59 117.00 0.93 0.99 0.91 0.98 0.95 0.80 0.99 0.88 0.00 3.00 0.00 0.00
max 411.36 1.00 1.00 1.00 60.44 24.78 28.20 88.78 1.00 1.00 1.00 1.00 1.00 1.00 1.00 255.00 255.00 255.00 255.00 255.00 1.0 220.00 255.00 175.00 194.00 230.00 255.00 255.00 255.00 254.00 20.64 7.00 79.53 64.22 82.44 184.00 1.03 1.04 1.01 1.04 1.07 1.02 1.03 0.97 1.00 5.34 1.00 1.00

Description of template_2

In [17]:
print("--------------- TEMPLATE 2 DESCRIPTION ---------------\n")
round(X_template_2.describe(),2)
--------------- TEMPLATE 2 DESCRIPTION ---------------

Out[17]:
col_48 col_49 col_50 col_51 col_52 col_53 col_54 col_55 col_56 col_57 col_58 col_59 col_60 col_61 col_62 col_63 col_64 col_65 col_66 col_67 col_68 col_69 col_70 col_71 col_72 col_73 col_74 col_75 col_76 col_77 col_78 col_79 col_80 col_81 col_82 col_83 col_84 col_85 col_86 col_87 col_88 col_89 col_90 col_91 col_92 col_93 col_94 col_95
count 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00 297232.00
mean 159.04 0.99 0.01 0.01 0.27 -5.97 -0.36 36.53 0.49 0.10 1.00 0.81 0.04 0.15 0.04 232.36 183.15 132.86 229.39 138.81 0.41 164.84 114.15 172.05 166.59 192.99 173.56 252.79 248.51 199.71 0.16 1.94 0.53 2.11 0.12 76.61 0.83 0.91 0.73 0.87 0.80 0.53 0.98 0.81 0.00 0.78 0.08 0.02
std 56.51 0.08 0.06 0.05 11.55 9.65 3.57 14.66 0.50 0.29 0.06 0.38 0.18 0.35 0.18 49.43 73.69 89.55 28.66 111.45 0.49 40.80 96.12 15.21 56.83 30.46 108.45 12.85 24.83 49.75 6.37 0.71 13.26 9.27 4.51 17.38 0.12 0.19 0.15 0.18 0.24 0.22 0.07 0.07 0.05 2.44 0.27 0.12
min -274.97 0.00 0.00 0.00 -81.81 -59.69 -51.77 3.78 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 110.00 0.00 51.00 0.00 0.00 0.00 0.00 0.00 0.00 -22.30 1.02 -81.87 -70.99 -142.70 1.00 -0.18 -0.09 -0.16 -0.17 -0.08 0.00 -0.18 0.32 0.00 -6.64 0.00 0.00
25% 126.87 1.00 0.00 0.00 -3.34 -13.66 -1.97 25.22 0.00 0.00 1.00 1.00 0.00 0.00 0.00 255.00 91.00 4.00 213.00 0.00 0.00 121.00 0.00 175.00 177.00 173.00 29.00 255.00 254.00 178.00 -4.06 1.52 -4.13 -3.67 -1.60 59.00 0.78 0.96 0.62 0.84 0.72 0.35 0.98 0.78 0.00 -1.12 0.00 0.00
50% 161.87 1.00 0.00 0.00 0.29 -5.20 -0.29 32.99 0.00 0.00 1.00 1.00 0.00 0.00 0.00 255.00 178.00 169.00 239.00 171.00 0.00 164.00 124.00 175.00 193.00 203.00 250.00 255.00 254.00 213.00 0.80 1.85 0.74 0.99 0.26 82.00 0.86 0.99 0.75 0.95 0.91 0.45 0.99 0.82 0.00 1.83 0.00 0.00
75% 194.94 1.00 0.00 0.00 4.45 1.67 1.28 47.56 1.00 0.00 1.00 1.00 0.00 0.00 0.00 255.00 255.00 194.00 255.00 255.00 1.00 208.00 205.00 175.00 194.00 218.00 254.00 255.00 254.00 235.00 4.86 2.12 6.04 6.81 2.00 88.00 0.92 0.99 0.85 0.98 0.98 0.78 0.99 0.86 0.00 2.71 0.00 0.00
max 452.42 1.00 1.00 1.00 74.49 37.22 41.84 88.86 1.00 1.00 1.00 1.00 1.00 1.00 1.00 255.00 255.00 255.00 255.00 255.00 1.00 220.00 255.00 175.00 194.00 230.00 254.00 255.00 255.00 254.00 20.01 7.00 79.79 129.62 87.13 213.00 1.03 1.03 1.01 1.05 1.05 1.01 1.05 0.96 1.00 5.13 1.00 1.00

Overall, the features seem to be quite clearly heterogeneous. Indeed, the scales of the quantitative features seem to vary from one variable to another. This can be observed by observing the variations of the value of the standard deviation for each features. This trend can be observed both for the dataframe of template_1 and for the dataframe of template_2. For example, the template_1 variable, col_0 has a standard deviation of 50.11 while the template_1 variables col_2 and col_7 have a standard deviation of 0.04 and 14.85 respectively. Therefore, the variable col_0 has a much higher value than the other features of the dataset. However, we notice that globally for most of the variables, the standard deviation fluctuates around 0.

Furthemore, the description of the dataset of the 2 templates seems to present a rather clear homogeneity. By comparing side by side, the values of the standard deviation and the average we notice that they are quite close. In any case at least the mathematical order of magnitude and the amplitude seems to be confirmed.

As a comparison, to take only these two variables the variable col_7 of template_1 and the variable col_55 of template_2 have respectively a standard deviation and a mean very close:

  • Mean comparison : 35.22 (template_1) and 36.56 (template_2) => (1.34 difference)

  • Standard deviation comparison : 14.85 (template_1) and 14.66 (template_2) => (0.19 difference)

We can also note that some variables have rather high amplitudes of values with a standard deviation higher than 100 as it is the case for the variables col_15 to col_19 for template_1 and col_63 to col_67. This again legitimates a possible standardization of the features before their use for machine learning purposes.

To sum up, overall, there are clear quantitative aspects to these characteristics, but it is difficult to draw an intuitive interpretation. As it is, given that the scales of the variables are very disparate, there is no doubt that it is necessary to rescale so that the variables can be compared on a common scale. Before applying a machine learning algorithm it will be necessary to normalize the variables.

Data Analysis for train labels dataset

Visualization of labels

In [18]:
y_dataframe.head(10)
Out[18]:
0
0 1
1 0
2 0
3 1
4 0
5 1
6 1
7 0
8 0
9 1

At first sight we notice that the labels seem to be binary. Indeed, the values taken by the target variable are 1 or 0. Now the next step is to confirm that by carrying out a description of the train labels dataset.

Rename columns

In [19]:
columns_list_labels = list()
for i in range(y_dataframe.shape[1]):
    name_col_label = "label_"+str(i)
    columns_list_labels.append(name_col_label)

y_dataframe.columns = columns_list_labels
    
y_dataframe
Out[19]:
label_0
0 1
1 0
2 0
3 1
4 0
... ...
297227 0
297228 0
297229 1
297230 0
297231 1

297232 rows × 1 columns

Train labels dataframe description

In [20]:
print("#### y_dataframe Description #### \n")

print("DIMENSION")
print("- " + str(y_dataframe.shape[1]) + ' features\n' + "- " + str(y_dataframe.shape[0]) + ' observations\n')

print("COLUMNS TYPES")
print(X_dataframe.dtypes)

print("\nMISSING VALUES")
print('- ' + str(y_dataframe.isna().sum().sum()) + ' missing values\n')

print("BINARY VARIABLES")
for i in range(y_dataframe.shape[1]):
    y_unique = y_dataframe["label_"+str(i)].unique().tolist()
    if len(y_unique) == 2:
        print("- label_" + str(i) + " is a binary variables : ",  y_unique)
#### y_dataframe Description #### 

DIMENSION
- 1 features
- 297232 observations

COLUMNS TYPES
col_0     float64
col_1     float64
col_2     float64
col_3     float64
col_4     float64
col_5     float64
col_6     float64
col_7     float64
col_8     float64
col_9     float64
col_10    float64
col_11    float64
col_12    float64
col_13    float64
col_14    float64
col_15    float64
col_16    float64
col_17    float64
col_18    float64
col_19    float64
col_20      int32
col_21    float64
col_22    float64
col_23    float64
col_24    float64
col_25    float64
col_26    float64
col_27    float64
col_28    float64
col_29    float64
col_30    float64
col_31    float64
col_32    float64
col_33    float64
col_34    float64
col_35    float64
col_36    float64
col_37    float64
col_38    float64
col_39    float64
col_40    float64
col_41    float64
col_42    float64
col_43    float64
col_44    float64
col_45    float64
col_46    float64
col_47    float64
col_48    float64
col_49    float64
col_50    float64
col_51    float64
col_52    float64
col_53    float64
col_54    float64
col_55    float64
col_56    float64
col_57    float64
col_58    float64
col_59    float64
col_60    float64
col_61    float64
col_62    float64
col_63    float64
col_64    float64
col_65    float64
col_66    float64
col_67    float64
col_68      int32
col_69    float64
col_70    float64
col_71    float64
col_72    float64
col_73    float64
col_74    float64
col_75    float64
col_76    float64
col_77    float64
col_78    float64
col_79    float64
col_80    float64
col_81    float64
col_82    float64
col_83    float64
col_84    float64
col_85    float64
col_86    float64
col_87    float64
col_88    float64
col_89    float64
col_90    float64
col_91    float64
col_92    float64
col_93    float64
col_94    float64
col_95    float64
dtype: object

MISSING VALUES
- 0 missing values

BINARY VARIABLES
- label_0 is a binary variables :  [1, 0]

The training labels dataset is composed of a single feature that corresponds to the target variable, (i.e. the variable that the machine learning model will try to predict).

We also notice that the train_labels datasets has the same number of rows (297232 rows). If this had not been the case, part of the training set data could not have been used to train the machine learning models.

Moreover train_labels do not have any missing values (i.e. NaN values).

Finally we can now confirm that the target variable is a binary variable composed of two classes {0, 1}. The values are of type python float64, thus it is necessary to convert them into integer (i.e. python int64).

Convert train_labels from float to int

In [21]:
y_dataframe['label_0'] = y_dataframe['label_0'].astype(int)

train_labels distribution analysis

In [22]:
bar_plot = sns.histplot(data=y_dataframe, x="label_0").set_title('Labels Distribution', fontsize = 15)
counter_values = y_dataframe['label_0'].value_counts()
print(counter_values)
1    148616
0    148616
Name: label_0, dtype: int64

Concerning the distribution of the train labels we notice that the distribution is perfectly balanced. This is a good point because we will not need to use resampling techniques such as SMOTE to work with a balanced dataset

Split the train set and train labels set into train set and validation set

  • train sample dataset contrains 90% of train dataset
  • validation sample dataset contains 10% train dataset
In [23]:
X_train, X_valid, y_train, y_valid = train_test_split(X_dataframe, y_dataframe, test_size=0.1)

Correlation analysis

In [24]:
correlations = X_train.corr()
fig_1 = plt.figure(figsize=(30, 20))
sns.heatmap(correlations, xticklabels=correlations.columns, yticklabels=correlations.columns, 
            cmap='YlGnBu')
plt.title('Heatmap for features correlation\n', fontsize=30)
plt.show()

This heatmap allows to visualize the covariance of the different features of the dataset. If two features are strongly correlated to each other, they will have a similar effect on the target variable. Therefore it will not be necessary to include the two strongly correlated features durint the training phase of a machine learning model. Thus it is appropriate to remove one of them without having a negative impact on the accuracy of the model predictions

Given the number of features in our dataset, it is difficult to visually determine each of the correlated variables. However, we can clearly distinguish 2 phenomena that emerge from the heatmap. Indeed, this heatmap reveals two striking phenomena.

First, we notice that some features are positively correlated and others are negatively correlated.

Secondly, we can see that the features specific to each template seem to be more positively correlated between them. On the other hand, when we cross the 48 features of template_1 with the 48 features of template_2 we notice that they tend to be more negatively correlated.

It is now necessary to verify these hypotheses by visualizing more closely the correlation.

Heatmap correlation for template_1

In [25]:
correlations_temp1 = X_train.iloc[:,0:48].corr()
fig_1 = plt.figure(figsize=(30, 20))
sns.heatmap(correlations_temp1, xticklabels=correlations_temp1.columns, yticklabels=correlations_temp1.columns, 
            cmap='YlGnBu', annot=True, linewidths=.5)
plt.title('Heatmap for Template 1 features correlation\n', fontsize=30)
plt.show()

Through this heatmap we are better able to analyze with finer granularity, the features correlation of template_1.

By focusing our analysis on the 48 features of template_1 we can show the value of the correlation between each feature (outside the diagonal).

Through the observation of this heatmap, the first hypothesis that we previously stated seems to be confirmed . Indeed, we can see that the features are more positively correlated between them. For example the features {col_17, col_20} with a correlation coefficient of 0.76 are strongly positively correlated.

Moreover, we also observe that some features are strongly negatively correlated even though this phenomenon occurs less frequently. For example the features {col_2, col_1} and {col_3, col_1} with respectively a correlation coefficient of -0.75 and -0.68 are strongly negatively correlated.

Heatmap correlation for template_2

In [26]:
correlations_temp2 = X_train.iloc[:,48:].corr()
fig_2 = plt.figure(figsize=(30, 20))
sns.heatmap(correlations_temp2, xticklabels=correlations_temp2.columns, yticklabels=correlations_temp2.columns, 
            cmap='YlGnBu', annot=True, linewidths=.5)
plt.title('Heatmap for Template 2 features correlation\n', fontsize=30)
plt.show()

As the previous heatmap specific to template_1, through this heatmap we are more able to analyze with finer granularity, the correlation of the variables of template_2.

In a similar way to the heatmap of template_1, by focusing our analysis on the 48 features of template_2 we can observe the value of the correlation between each feature (outside the diagonal).

Through the observation of this heatmap of template_2, it is more complicated to validate the first hypothesis that we made previously. Indeed, the difference in proportion between the number of positively correlated features and the number of negatively correlated features seems to be less obvious.

Some features are strongly positively correlated like the features {col_67, col_65} with a correlation coefficient a 0.85. On the other hand, features like {col_68, col_64} with a correlation coefficient of -0.85 are strongly negatively correlated.

Overall conclusion

Anyway the distribution of the correlated features does not seem to follow a real logic. Unfortunately at this stage it is not possible to provide a viable interpretation of the correlation of some variables as it is so random.

PCA for visualization

The objective of PCA is to simplify the model while retaining as much information as possible. To do this, in order to keep as much information as possible, PCA will try to find the best axis to project the data in a smaller dimension by maximizing the variance. The axis that will maximize the variance is the axis that is closest to all the data points.

In [27]:
nb = 50000 # analyse the first 50 000 observations
X_pca = X_train[:nb].to_numpy()
y_pca = y_train[:nb].to_numpy()

mask = (y_pca==1).flatten()
pca = PCA(n_components=2)
pca_components = pca.fit_transform(X_pca)

label1 = pca_components[mask]
label2 = pca_components[~mask]

fig_1 = plt.figure(figsize=(20,16))

plt.scatter(label1[:,0], label1[:,1], c="lightsalmon", alpha=0.5, s=2, label="y = 1")
plt.scatter(label2[:,0], label2[:,1], c="lightskyblue", alpha=0.5, s=2, label="y = 0")

plt.xlim([-400, 400])
plt.xlabel("component 1", fontsize = 15)

plt.ylim([-230, 350])
plt.ylabel("component 2", fontsize = 15)

plt.title('PCA on the model features\n', fontsize = 30)
plt.grid(True)
plt.legend(fontsize = 15)
plt.show()

At first glance the data does not seem to be separable at all. We expected that there would be clearly 2 distinct compact groups for each label value. Indeed, whether it is for the class y=0 or class y=1, these are very large and spread out.

However, the separation of the data of each class is not clearly visible. Hence, the PCA reveals that the most difficult part of this data challenge will be to correctly identify the pairs of matching images, which will tend to be identified as non-matching pairs.

Verification of possible redundant data points

Throughout this data exploration phase, we can easily see that the features of each template seem to follow a very similar structure. This suggests that the data set of features col_0 to col_47 (features template_1) or col_48 to col_95 (features template_2) uniquely represent an image. Therefore we can deduce that the features describing template_1 and the features describing template_2 should logically not share the same values.

Although this is a rather strong assumption, it is necessary to check the accuracy of this assumption by performing a duplicate analysis in the dataset.

Indeed, if it proves to be correct and the same pairs of images appear on the training set and the test set, the information on the training set can be directly exploited to produce predictions on the test set.

In [28]:
# drop duplicated elements in the train set 
# if at least two rows contains the same values one of these rows is rejected from the train set
nb_template1_train = X_train.iloc[:,0:48].drop_duplicates()
nb_template2_train = X_train.iloc[:,48:].drop_duplicates()
print("---- DISTINCTS VALUES TRAIN SET ----")
print("Train set size :", X_train.shape[0])
print("- " + str(nb_template1_train.shape[0]) + " distinct value(s) for template 1 in the train set (i.e " + str(round((nb_template1_train.shape[0] / X_train.shape[0])*100, 3)) + "% of train set size)")
print("- " + str(nb_template2_train.shape[0]) + " distinct value(s) for template 2 in the train set (i.e " + str(round((nb_template2_train.shape[0] / X_train.shape[0])*100, 3)) + "% of train set size)\n")

# drop duplicated elements in the validation set 
# if at least two rows contains the same values one of these rows is rejected from the validation set
nb_template1_valid = X_valid.iloc[:,0:48].drop_duplicates()
nb_template2_valid = X_valid.iloc[:,48:].drop_duplicates()
print("---- DISTINCTS VALUES VALIDATION SET ----")
print("Validation set size :", X_valid.shape[0])
print("- " + str(nb_template1_valid.shape[0]) + " distinct values for template 1 in the validation set (i.e " + str(round((nb_template1_valid.shape[0] / X_valid.shape[0])*100, 3)) + "% of validation set size)")
print("- " + str(nb_template2_valid.shape[0]) + " distinct values for template 2 in the validation set (i.e " + str(round((nb_template2_valid.shape[0] / X_valid.shape[0])*100, 3)) + "% of validation set size)\n")

# check if there are template 1 on the train set that are the same as template 1 on the test set.
list_template1_train = np.asarray(X_train.iloc[:,0:48].drop_duplicates().values.tolist())
list_template1_valid = np.asarray(X_valid.iloc[:,0:48].drop_duplicates().values.tolist())
template1_train_set = set([tuple(x) for x in list_template1_train])
template1_valid_set = set([tuple(x) for x in list_template1_valid])
intersection_template1 = np.array([x for x in template1_train_set & template1_valid_set])
nb_intersection = intersection_template1.shape[0]
print("\n---- SAME TEMPLATE 1 BETWEEN TRAIN SET AND VALIDATION SET ----")
print("- " + str(nb_intersection) + " template-1 value(s) in the valid set that are also part of the train set.")

# check if there are template 2 on the train set that are the same as template 2 on the test set.
list_template2_train = np.asarray(X_train.iloc[:,48:].drop_duplicates().values.tolist())
list_template2_valid = np.asarray(X_valid.iloc[:,48:].drop_duplicates().values.tolist())
template2_train_set = set([tuple(x) for x in list_template2_train])
template2_valid_set = set([tuple(x) for x in list_template2_valid])
intersection_template2 = np.array([x for x in template2_train_set & template2_valid_set])
nb_intersection = intersection_template2.shape[0]
print("\n---- SAME TEMPLATE 2 BETWEEN TRAIN SET AND VALIDATION SET ----")
print("- " + str(nb_intersection) + " template-2 value(s) in the valid set that are also part of the train set.")
---- DISTINCTS VALUES TRAIN SET ----
Train set size : 267508
- 267459 distinct value(s) for template 1 in the train set (i.e 99.982% of train set size)
- 267492 distinct value(s) for template 2 in the train set (i.e 99.994% of train set size)

---- DISTINCTS VALUES VALIDATION SET ----
Validation set size : 29724
- 29723 distinct values for template 1 in the validation set (i.e 99.997% of validation set size)
- 29724 distinct values for template 2 in the validation set (i.e 100.0% of validation set size)


---- SAME TEMPLATE 1 BETWEEN TRAIN SET AND VALIDATION SET ----
- 4 template-1 value(s) in the valid set that are also part of the train set.

---- SAME TEMPLATE 2 BETWEEN TRAIN SET AND VALIDATION SET ----
- 1 template-2 value(s) in the valid set that are also part of the train set.

Concerning the training dataset we notice that almost all the feature values of template_1 and template_2 are distinct. Indeed, 99.98 % of the training set data describing template_1 are distinct. Moreover 99.99 % of the data of the training set describing template_2 are distinct. Therefore, we can say that our dataset contains almost no duplicate data. This means that the set of images in template_1 and template_2 have many images that are different from each other.

Moreover, the important thing now is to know if the sets of images template_1 and template_2 contained in the training dataset also contain images identical to the validation dataset. If this is the case, then this information can be exploited with profit. In our case, the results obtained show that on the one hand, 4 images of the template_1 set of the validation dataset are also contained in the training dataset and on the other hand 1 image of the template_2 set of the validation dataset are also contained in the training dataset. Therefore, the strategy of identifying and working with images common to the training and test sets does not seem to be profitable and efficient here. In conclusion, this strategy will not be pursued.

-------- Data Preprocessing --------

Load and Preprocessing function

Through this exploratory phase, it is appropriate to centralize the different pre-processing operations that we have previously carried out within a unique function.

In [29]:
def preprocessing():
    """
    Preprocessed dataframe
    ---- PARAMETERS ----
    Input : None
    Return : train dataframe cleaned, train labels cleaned
    """
    X, y = np.load("train_data.npy"), extract_labels("train_labels.txt")
    
    # convert X to DataFrame
    X_copied = X.copy()
    X_dataframe = pd.DataFrame(X_copied)
    
    # convert y to DataFrame
    y_copied = y.copy()
    y_dataframe = pd.DataFrame(y_copied)
    
    # rename X_dataframe columns
    columns_list = list()
    for i in range(X_dataframe.shape[1]):
        name_col = "col_"+str(i)
        columns_list.append(name_col)
    X_dataframe.columns = columns_list

    # rename y_dataframe columns    
    columns_list_labels = list()
    for i in range(y_dataframe.shape[1]):
        name_col_label = "label_"+str(i)
        columns_list_labels.append(name_col_label)
    y_dataframe.columns = columns_list_labels
    
    # preprocessing X_dataframe 
    X_dataframe.loc[X_dataframe['col_20'] == 0.0, 'col_20'] = 0
    X_dataframe.loc[X_dataframe['col_20'] == 255.0, 'col_20'] = 1
    X_dataframe['col_20'] = X_dataframe['col_20'].astype(int)

    X_dataframe.loc[X_dataframe['col_68'] == 0.0, 'col_68'] = 0
    X_dataframe.loc[X_dataframe['col_68'] == 255.0, 'col_68'] = 1
    X_dataframe['col_68'] = X_dataframe['col_68'].astype(int)
    
    # convert labels value from float to int
    y_dataframe['label_0'] = y_dataframe['label_0'].astype(int)
    
    # select index duplicated row for template 1
    X_template_1 = X_dataframe.iloc[:,0:48]
    template1_dup_rows = X_template_1[X_template_1.duplicated(keep = "first")]
    template1_dup_rows_tuple_idx = template1_dup_rows.groupby(list(template1_dup_rows)).apply(lambda x: tuple(x.index)).tolist()
    template1_dup_row_idx = [item for t in template1_dup_rows_tuple_idx for item in t]
    
    # select index duplicated row for template 2
    X_template_2 = X_dataframe.iloc[:,48:96]
    template2_dup_rows = X_template_2[X_template_2.duplicated(keep = "first")]
    template2_dup_rows_tuple_idx = template2_dup_rows.groupby(list(template2_dup_rows)).apply(lambda x: tuple(x.index)).tolist()
    template2_dup_row_idx = [item for t in template2_dup_rows_tuple_idx for item in t]
    
    dup_idx_list = template1_dup_row_idx + template2_dup_row_idx
    
    # drop duplicated rows for train dataset
    X_dataframe.drop(dup_idx_list, axis=0, inplace=True)
    
    # drop duplicated rows for train labels dataset
    y_dataframe.drop(dup_idx_list, axis=0, inplace=True)
    
    return X_dataframe, y_dataframe

X_df_cleaned, y_df_cleaned = preprocessing()

Checking distrinct values for each template after preprocessing

In [30]:
# drop duplicated elements in the train set 
# if at least two rows contains the same values one of these rows is rejected from the train set
nb_template1_train = X_df_cleaned.iloc[:,0:48].drop_duplicates()
nb_template2_train = X_df_cleaned.iloc[:,48:].drop_duplicates()
print("---- Checking DISTINCTS VALUES X_DataFrame after preprocessing ----")
print("Train set size :", X_df_cleaned.shape[0])
print("- " + str(nb_template1_train.shape[0]) + " distinct value(s) for template 1 in the X_DataFrame (i.e " + str(round((nb_template1_train.shape[0] / X_df_cleaned.shape[0])*100, 3)) + "% of train set size)")
print("- " + str(nb_template2_train.shape[0]) + " distinct value(s) for template 2 in the X_DataFrame (i.e " + str(round((nb_template2_train.shape[0] / X_df_cleaned.shape[0])*100, 3)) + "% of train set size)\n")
---- Checking DISTINCTS VALUES X_DataFrame after preprocessing ----
Train set size : 297178
- 297178 distinct value(s) for template 1 in the X_DataFrame (i.e 100.0% of train set size)
- 297178 distinct value(s) for template 2 in the X_DataFrame (i.e 100.0% of train set size)

The preprocessing phase seems to have done the job. We can see that all the duplicate images for template_1 and template_2 have been removed. We obtain a dataset cleaned of all duplicate elements

Analyze Train labels distribution after preprocessing

In [31]:
bar_plot = sns.histplot(data=y_df_cleaned, x="label_0").set_title('Labels Distribution after Preprocessing', fontsize=15)
counter_values_cleaned = y_df_cleaned['label_0'].value_counts()
print(counter_values_cleaned)
1    148610
0    148568
Name: label_0, dtype: int64

After the preprocessing phase, we notice that the distribution of the label values is no longer perfectly balanced. Indeed, we can observe that the label y=1 is slightly more represented than the label y=0 but this imbalance remains annecdotal compared to the size of the dataset.

Check train set length and train_labels set length

In [32]:
print("Train dataset length: ", len(X_df_cleaned))
print("Train labels dataset length: ", len(y_df_cleaned))
Train dataset length:  297178
Train labels dataset length:  297178

We can see that the preprocessing has been applied because the train dataset and the train labels dataset have the same length. Tha'ts a good point.

We can now elaborate the strategic approach we are going to address for the implementation and execution of machine learning algorithms

-------- Feature Selection --------

We notice that the dataset has a lot of features nearly close to 100 features. We also noticed earlier with the correlation analysis of the variables that some variables are particularly strong correlated. Therefore it is advisable to remove the weakly correlated variables in order to simplify the model and reduce the cost of learning the model. However, this may not be enough, it is possible that once the correlated variables are discarded from the dataset, it is necessary to further restrict the number of features in order to work only with a subset of relevant variables. This is the challenge of the features selection step. We try to minimize the loss of information coming from the deletion of all the other variables while simplifying the classification task that the machine learning model will have to perform.

Feature Selection strategy

The goal of feature selection in machine learning is to find the best set of features to build useful models of the studied phenomena.

To do that the strategy is to apply 3 feature selection algorithms which are :

  • Featurewiz
  • ElasticNet
  • Random Forest Feature Importance

Basis on the features selected by these 3 algorithms I select the 25 best features for each algorithms. Then I select features that are selected by at least 2 different algorithms will be use to train the machine learning models.

In fact, the idea is to compare the results of each algorithm, in order to select the 25 most important features of our dataset in the most robust way.

Rescale all features except binary variables

Normalize X_df_cleaned

In [33]:
list_binary_columns = list()
for i in X_df_cleaned.columns:
    if len(X_df_cleaned[i].unique()) < 3: 
        list_binary_columns.append(i)

list_binary_columns
Out[33]:
['col_20', 'col_68']
In [34]:
X_df_copied = X_df_cleaned.copy()
X_raw_df_cleaned = pd.DataFrame(X_df_copied)

for col in X_df_cleaned.columns:
    if col not in list_binary_columns:
        X_df_cleaned[col] = (X_df_cleaned[col] - X_df_cleaned[col].mean()) / X_df_cleaned[col].std()
        
X_df_cleaned
Out[34]:
col_0 col_1 col_2 col_3 col_4 col_5 col_6 col_7 col_8 col_9 col_10 col_11 col_12 col_13 col_14 col_15 col_16 col_17 col_18 col_19 col_20 col_21 col_22 col_23 col_24 col_25 col_26 col_27 col_28 col_29 col_30 col_31 col_32 col_33 col_34 col_35 col_36 col_37 col_38 col_39 col_40 col_41 col_42 col_43 col_44 col_45 col_46 col_47 col_48 col_49 col_50 col_51 col_52 col_53 col_54 col_55 col_56 col_57 col_58 col_59 col_60 col_61 col_62 col_63 col_64 col_65 col_66 col_67 col_68 col_69 col_70 col_71 col_72 col_73 col_74 col_75 col_76 col_77 col_78 col_79 col_80 col_81 col_82 col_83 col_84 col_85 col_86 col_87 col_88 col_89 col_90 col_91 col_92 col_93 col_94 col_95
0 1.302253 0.132587 -0.100904 -0.089055 -0.183374 0.202542 0.428353 -0.767115 1.041148 -0.258372 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 -0.853406 0.820022 -1.031166 0 0.830449 -0.174367 0.217755 0.400903 0.715282 0.645788 0.137490 0.172125 0.736075 0.915602 3.127295 -0.030674 0.017933 0.381966 -0.837885 0.874914 0.360028 -1.176633 0.779463 -0.821182 -0.965149 0.347332 -0.615637 -0.016694 0.727583 -0.217292 -0.105722 1.401711 0.159267 -0.124400 -0.108598 2.255700 -0.396098 -1.360661 -0.677214 1.016461 -0.346288 0.065147 0.485437 -0.201068 -0.426977 -0.218298 0.458053 0.974836 0.448199 0.893720 1.042432 0 -1.344304 -1.187437 0.194204 -0.239038 -1.870789 0.677211 0.171752 0.220953 -0.496548 0.141587 -0.022015 1.975331 0.170902 -0.350752 0.079815 0.821160 0.417606 -0.566178 0.225239 0.798645 -0.920622 0.277911 0.273220 -0.062371 0.704176 -0.296426 -0.126682
1 1.142801 0.132587 -0.100904 -0.089055 -0.063933 -1.322580 -0.441881 1.536166 -0.961807 4.022014 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 0.776536 -1.124104 0.496148 1 0.739219 0.016162 0.032533 0.277422 0.883087 0.645788 -6.192823 -1.455988 0.027221 0.060955 -0.713104 -0.208495 -0.855011 -0.788502 1.091045 -0.211516 0.409207 0.841658 -0.249115 0.442407 0.339085 -1.971428 0.477096 -0.016694 0.455889 4.615693 -0.105722 -0.917994 0.159267 -0.124400 -0.108598 0.296116 1.566661 0.337378 1.587329 -0.984942 2.639889 0.065147 0.485437 -0.201068 -0.426977 3.560195 0.458053 -0.178596 0.537534 -0.013551 1.042432 1 -0.584439 -0.958566 0.194204 0.464752 0.624102 0.428260 0.171752 0.220953 -0.838222 -1.220549 -1.148044 0.372426 2.356462 1.020251 1.115392 1.314311 0.461060 -0.601426 -0.124092 0.518065 -0.603571 0.030059 0.238079 -0.062371 0.598633 3.394588 -0.126682
2 -0.248582 0.132587 -0.100904 -0.089055 -0.197999 -0.333732 0.088580 -0.838552 1.041148 -0.262657 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 -2.770834 0.776536 0.503537 -3.603483 1 0.898871 1.585224 0.217755 0.400903 0.077622 0.645788 0.137490 0.172125 0.776581 0.565321 -0.191380 -0.128439 -0.335880 -0.319805 -1.932684 -2.062530 0.355032 1.132756 0.797684 -0.767774 0.899974 0.226721 0.329146 -0.016694 0.553849 -0.217292 -0.105722 0.983522 0.159267 -0.124400 -0.108598 0.083984 -1.156284 0.275774 -0.878505 1.016461 2.741807 0.065147 0.485437 -0.201068 -0.426977 3.842172 0.458053 -1.264178 -1.450164 -0.362502 -1.245545 1 1.351991 1.465379 0.194204 0.482346 0.525620 0.741754 0.171752 -0.947148 0.850047 1.066156 -0.129526 0.173402 -0.690070 0.614471 -0.955762 -1.005369 0.476061 -0.770848 -1.376946 0.254973 -0.674853 -0.786888 -1.048079 -0.062371 0.910428 3.394588 -0.126682
3 -0.105832 -11.554710 -0.100904 17.677702 3.607702 -1.360483 -3.235993 0.673606 1.041148 -0.249803 0.008961 0.444393 -0.196754 -0.385971 -0.140407 0.145382 -5.606576 -3.482346 -3.068231 -3.089020 0 -1.609955 -1.059766 0.217755 -0.072442 -2.338775 -1.645802 -3.510487 -1.455988 -1.451245 -2.019677 0.775719 4.677981 -0.132888 0.518749 -2.297617 0.446036 0.399059 -1.998463 -0.497970 0.755224 1.302547 0.152640 0.408241 -0.016694 -0.810967 -0.217292 -0.105722 0.248804 0.159267 -0.124400 -0.108598 -3.134801 -0.016005 0.231251 -0.419239 1.016461 -0.339494 0.065147 -2.126710 -0.201068 2.430384 -0.212658 0.458053 0.296347 0.157861 0.265609 0.342580 0 -1.344304 -1.187437 0.194204 0.200831 -1.969272 -1.600233 0.171752 -0.947148 -0.597040 -1.267947 3.322122 -2.724295 0.294037 -1.309324 -0.495506 0.429957 0.443906 0.698506 -0.199586 0.565256 1.644080 0.219150 1.389216 -0.062371 -1.520433 -0.296426 -0.126682
4 -0.340030 0.132587 -0.100904 -0.089055 0.089821 -0.033490 -0.053916 -0.515100 -0.961807 -0.258372 0.008961 0.444393 -0.196754 -0.385971 -0.140407 0.145382 0.231715 -0.196171 0.458324 0.496148 0 -0.606424 -0.252820 0.217755 -0.710430 -0.023061 -0.136936 0.137490 0.172125 -1.289221 0.170152 0.453777 0.015985 -0.100264 0.375270 -0.629352 -2.699076 -1.968822 0.555322 0.825359 -1.062521 -0.656780 0.444679 -1.703507 -0.016694 -0.236375 -0.217292 -0.105722 0.759148 -1.527793 2.154738 -0.108598 -0.489120 1.187398 -0.011526 -1.208647 -0.984942 -0.346288 0.065147 0.485437 -0.201068 -0.426977 -0.218298 0.458053 0.974836 1.129377 -0.467187 1.042432 0 -0.118716 0.497882 0.194204 0.482346 0.230172 -0.549105 0.171752 0.220953 -0.717631 -0.781100 -0.279474 -0.271087 -0.530890 -0.323035 0.655136 0.690759 0.144079 1.563471 0.503010 -0.610026 -0.239157 0.160390 0.490975 -0.062371 -0.736722 -0.296426 -0.126682
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
297227 -1.853771 -0.046966 0.032971 0.010201 2.558985 -2.422389 -6.824064 -0.579400 -0.961807 -0.254088 0.008961 0.444393 -0.196754 -0.385971 -0.140407 0.145382 0.231715 -0.800828 0.820022 -0.854319 0 -1.427495 -1.059766 0.217755 0.400903 0.681721 -0.221810 0.137490 0.172125 -4.225901 -0.563545 0.550487 2.137625 -4.234803 -3.838225 -1.046418 0.401361 0.281966 -5.652549 0.481140 0.452266 -1.164403 0.517688 -1.879486 -0.016694 -0.377855 -0.217292 -0.105722 -1.093777 -8.569435 11.667665 -0.108598 -3.328317 -0.498092 1.955612 -0.612345 -0.984942 -0.342891 0.065147 0.485437 -0.201068 -0.426977 -0.212658 0.458053 0.974836 1.363881 0.893720 1.042432 0 -1.344304 -0.126310 -0.397548 0.148046 -2.067754 -1.581792 0.171752 0.220953 -1.220092 -0.014575 -0.176208 -2.627045 -0.652406 0.512028 -0.553038 -0.674106 -0.455577 1.294807 0.666404 -3.098122 -1.118615 0.057559 -1.604920 -0.062371 -1.024164 -0.296426 -0.126682
297228 -0.469272 0.132587 -0.100904 -0.089055 0.301702 -0.565699 -0.920337 0.118876 -0.961807 -0.262657 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 0.776536 -1.395378 0.496148 1 0.191839 0.823108 0.217755 0.400903 0.916649 0.645788 0.137490 0.172125 0.675316 -0.490746 -0.751279 0.450550 -1.062146 -0.887343 0.882512 0.763978 0.381254 0.492754 -1.226214 0.824493 -0.510030 0.239814 0.277296 -0.016694 0.578277 -0.217292 -0.105722 -0.099418 0.159267 -0.124400 -0.108598 0.619077 -0.557174 -2.272405 1.193751 -0.984942 -0.349686 0.065147 0.485437 -0.201068 -0.426977 -0.218298 0.458053 -0.300724 0.392365 -1.583829 1.042432 0 -1.001139 1.298928 -2.304308 0.482346 1.149342 0.741754 0.171752 0.220953 0.186799 -1.953488 -1.133898 0.820986 -0.758058 -0.891127 1.230456 0.576328 -1.380161 0.442004 0.638426 -1.439970 -1.056045 0.147257 -1.119598 -0.062371 0.632656 -0.296426 -0.126682
297229 0.972296 0.132587 -0.100904 -0.089055 -2.924514 0.255512 -0.699485 -0.125665 1.041148 -0.258372 0.008961 -2.326512 -0.196754 2.694919 -0.140407 0.145382 0.231715 0.776536 0.820022 0.496148 0 -1.609955 1.540393 0.217755 0.051039 -2.338775 -0.193519 0.137490 0.172125 -1.127197 -1.721791 0.056758 -2.757989 0.053145 -0.783401 -1.359218 0.043962 -2.242039 -0.687611 0.855956 -3.070256 -1.382633 0.322996 -2.759380 -0.016694 0.303142 -0.217292 -0.105722 0.613270 0.024791 0.057270 -0.108598 0.688691 0.849491 -0.338029 -0.270334 1.016461 -0.346288 0.065147 -2.126710 -0.201068 2.430384 -0.212658 0.458053 0.974836 0.705037 0.893720 0.001627 0 1.180408 1.122073 0.194204 0.376778 0.492792 0.741754 0.171752 0.220953 0.809850 0.910623 -0.858049 0.424897 -0.131917 0.163013 0.712668 1.203670 0.296622 -2.112631 0.601046 0.811148 1.441482 0.204460 0.967315 -0.062371 0.407726 -0.296426 -0.126682
297230 -1.191522 0.132587 -0.100904 -0.089055 -1.232935 -0.088134 1.184868 -0.039483 -0.961807 -0.258372 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 -1.221458 0.729598 -1.947554 0 -1.564340 -1.059766 0.217755 0.400903 -1.600431 0.636358 0.137490 0.172125 0.027221 -1.564769 0.326527 -1.141185 -0.481003 0.033153 -0.681486 0.562190 0.366533 -0.626483 0.473491 0.020937 -1.207100 0.055294 -0.359667 -0.016694 0.520683 -0.217292 -0.105722 -1.689200 0.098142 -0.124400 -0.009235 3.209776 0.089409 0.188688 -0.292707 -0.984942 -0.342891 0.065147 0.485437 -0.201068 -0.426977 -0.212658 0.458053 -0.029328 -0.054309 0.300504 -0.159877 0 -1.246257 -1.187437 -5.986325 -2.262434 -1.739479 -1.194535 0.171752 0.220953 -0.436253 -2.078260 3.496119 3.023152 1.592725 2.110534 -0.265377 -0.152502 0.365004 0.082545 0.355955 -1.410218 -0.658166 0.131010 -0.665850 -0.062371 -1.202690 -0.296426 -0.126682
297231 1.021223 0.132587 -0.100904 -0.089055 -3.243742 -1.251076 2.920830 -0.542772 1.041148 -0.258372 0.008961 -2.326512 -0.196754 2.694919 -0.140407 0.145382 0.231715 -1.379194 -4.153325 0.496148 0 -1.290650 -1.059766 0.217755 0.236261 -2.238092 0.296863 0.137490 0.172125 -4.225901 -1.379508 0.553032 -2.868530 -0.678170 0.661271 -1.150685 -2.091497 0.392554 0.467026 0.435245 0.236602 -0.789616 0.298659 -0.359667 -0.016694 -1.063582 -0.217292 -0.105722 1.616759 0.159267 -0.124400 -0.108598 -0.450157 -0.849058 0.694404 -0.885394 1.016461 -0.342891 0.065147 -2.126710 -0.201068 2.430384 -0.212658 0.458053 0.974836 0.749704 0.684350 1.042432 0 0.101890 -1.156227 0.194204 0.464752 1.083687 -0.890261 0.171752 0.220953 0.307389 1.193597 1.420887 -0.561030 -0.758598 0.445064 0.770200 -0.126422 0.401825 -0.199224 0.285151 0.752801 0.317986 0.233841 0.667901 -0.062371 -0.218342 -0.296426 -0.126682

297178 rows × 96 columns

Analysis of highly correlated variables

Visulization of correlation heatmap before features selection

In [35]:
correlations = X_df_cleaned.corr()
fig_1 = plt.figure(figsize=(30, 20))
sns.heatmap(correlations, xticklabels=correlations.columns, yticklabels=correlations.columns, 
            cmap='YlGnBu')
plt.title('Heatmap for features correlation before feature selection\n', fontsize=30)
plt.show()

Build the list of pairs of features with a correlation (in absolute value) higher than +0.6 or less than -0.6

In [36]:
df_corr = X_df_cleaned.corr()
l = list()
for i in X_df_cleaned.columns:
    for j in X_df_cleaned.columns:
        if abs(df_corr.loc[i,j]) > 0.6 and df_corr.loc[i,j] != 1.0:
            if [i, j, df_corr.loc[i,j]] not in l and [j, i, df_corr.loc[j,i]] not in l :
                l.append([i, j, df_corr.loc[i,j]])

df_val = pd.DataFrame(l, columns=['feature1','feature2','val'])

# selecting negative correlated features 
df_negative_corr = df_val[df_val['val'] < 0]

# selecting positive correlated features 
df_positive_corr = df_val[df_val['val'] >= 0]

print("---- Negative Correlated Features (Less than -0.6)----\n")
print(df_negative_corr)

print("\n\n\n---- Positive Correlated Features (Higher than +0.6) ----\n")
print(df_positive_corr)

df_val = df_val.sort_values(by=['val'], ascending = False)

print("\n\n\n---- Summary correlated features ----")
print("- Negative correlated features : ", df_negative_corr.shape[0])
print("- Positive correlated features : ", df_positive_corr.shape[0])
---- Negative Correlated Features (Less than -0.6)----

   feature1 feature2       val
0     col_1    col_2 -0.753455
1     col_1    col_3 -0.682144
9    col_11   col_13 -0.869925
15   col_17   col_64 -0.684595
18   col_20   col_31 -0.691420
21   col_20   col_64 -0.905564
22   col_20   col_65 -0.709576
23   col_20   col_67 -0.605198
26   col_21   col_64 -0.672251
31   col_35   col_64 -0.695911
38   col_49   col_50 -0.789892
39   col_49   col_51 -0.674776
46   col_59   col_61 -0.881814
50   col_64   col_68 -0.866379
53   col_65   col_68 -0.795163
55   col_67   col_68 -0.669523
57   col_68   col_83 -0.611771



---- Positive Correlated Features (Higher than +0.6) ----

   feature1 feature2       val
2     col_4   col_32  0.964409
3     col_5   col_33  0.702345
4     col_7   col_55  0.779731
5     col_8   col_41  0.628739
6     col_8   col_56  0.930032
7     col_8   col_89  0.625088
8     col_9   col_46  0.839066
10   col_11   col_59  0.628477
11   col_14   col_47  0.758424
12   col_17   col_20  0.762286
13   col_17   col_21  0.637138
14   col_17   col_35  0.684365
16   col_17   col_68  0.631392
17   col_20   col_21  0.742932
19   col_20   col_35  0.774532
20   col_20   col_45  0.616101
24   col_20   col_68  0.827511
25   col_21   col_35  0.617773
27   col_21   col_68  0.615012
28   col_26   col_45  0.941012
29   col_28   col_42  0.714455
30   col_31   col_64  0.625347
32   col_35   col_68  0.638093
33   col_37   col_40  0.691585
34   col_37   col_43  0.689994
35   col_40   col_43  0.732007
36   col_41   col_56  0.609972
37   col_41   col_89  0.721098
40   col_52   col_80  0.948570
41   col_53   col_81  0.725788
42   col_54   col_82  0.613429
43   col_56   col_89  0.643693
44   col_57   col_62  0.609123
45   col_57   col_94  0.840101
47   col_63   col_65  0.625020
48   col_64   col_65  0.901831
49   col_64   col_67  0.780447
51   col_64   col_83  0.698302
52   col_65   col_67  0.853770
54   col_65   col_83  0.789975
56   col_67   col_83  0.819769
58   col_74   col_93  0.922927
59   col_76   col_90  0.767721
60   col_85   col_88  0.632677
61   col_85   col_91  0.600124
62   col_88   col_91  0.775888



---- Summary correlated features ----
- Negative correlated features :  17
- Positive correlated features :  46

On the 96 features that compose the dataset, we have 17 features that are negatively correlated and 46 features that are positively correlated

1st Features Selection method : Featurewiz

Removal of correlated variables with Featurewiz features selection techniques

Featurewiz is a python library that can find the best features in our dataset if we give it a dataframe and the name of the target variable. It will do the following:

  • It will automatically remove highly correlated features (the limit is set to 0.5 but we can change it in the input argument).

  • If several features are correlated with each other, which one to delete? In such a conflict, the algorithm will delete the feature with the lowest mutual information score.

  • Finally the algorithm will do a recursive selection of features using the XGBoost algorithm to find the best features using XGBoost.

In [37]:
# Join train data and train labels in the same dataframe
data = pd.concat([X_df_cleaned, y_df_cleaned], axis=1).reindex(X_df_cleaned.index)
data
Out[37]:
col_0 col_1 col_2 col_3 col_4 col_5 col_6 col_7 col_8 col_9 col_10 col_11 col_12 col_13 col_14 col_15 col_16 col_17 col_18 col_19 col_20 col_21 col_22 col_23 col_24 col_25 col_26 col_27 col_28 col_29 col_30 col_31 col_32 col_33 col_34 col_35 col_36 col_37 col_38 col_39 col_40 col_41 col_42 col_43 col_44 col_45 col_46 col_47 col_48 col_49 col_50 col_51 col_52 col_53 col_54 col_55 col_56 col_57 col_58 col_59 col_60 col_61 col_62 col_63 col_64 col_65 col_66 col_67 col_68 col_69 col_70 col_71 col_72 col_73 col_74 col_75 col_76 col_77 col_78 col_79 col_80 col_81 col_82 col_83 col_84 col_85 col_86 col_87 col_88 col_89 col_90 col_91 col_92 col_93 col_94 col_95 label_0
0 1.302253 0.132587 -0.100904 -0.089055 -0.183374 0.202542 0.428353 -0.767115 1.041148 -0.258372 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 -0.853406 0.820022 -1.031166 0 0.830449 -0.174367 0.217755 0.400903 0.715282 0.645788 0.137490 0.172125 0.736075 0.915602 3.127295 -0.030674 0.017933 0.381966 -0.837885 0.874914 0.360028 -1.176633 0.779463 -0.821182 -0.965149 0.347332 -0.615637 -0.016694 0.727583 -0.217292 -0.105722 1.401711 0.159267 -0.124400 -0.108598 2.255700 -0.396098 -1.360661 -0.677214 1.016461 -0.346288 0.065147 0.485437 -0.201068 -0.426977 -0.218298 0.458053 0.974836 0.448199 0.893720 1.042432 0 -1.344304 -1.187437 0.194204 -0.239038 -1.870789 0.677211 0.171752 0.220953 -0.496548 0.141587 -0.022015 1.975331 0.170902 -0.350752 0.079815 0.821160 0.417606 -0.566178 0.225239 0.798645 -0.920622 0.277911 0.273220 -0.062371 0.704176 -0.296426 -0.126682 1
1 1.142801 0.132587 -0.100904 -0.089055 -0.063933 -1.322580 -0.441881 1.536166 -0.961807 4.022014 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 0.776536 -1.124104 0.496148 1 0.739219 0.016162 0.032533 0.277422 0.883087 0.645788 -6.192823 -1.455988 0.027221 0.060955 -0.713104 -0.208495 -0.855011 -0.788502 1.091045 -0.211516 0.409207 0.841658 -0.249115 0.442407 0.339085 -1.971428 0.477096 -0.016694 0.455889 4.615693 -0.105722 -0.917994 0.159267 -0.124400 -0.108598 0.296116 1.566661 0.337378 1.587329 -0.984942 2.639889 0.065147 0.485437 -0.201068 -0.426977 3.560195 0.458053 -0.178596 0.537534 -0.013551 1.042432 1 -0.584439 -0.958566 0.194204 0.464752 0.624102 0.428260 0.171752 0.220953 -0.838222 -1.220549 -1.148044 0.372426 2.356462 1.020251 1.115392 1.314311 0.461060 -0.601426 -0.124092 0.518065 -0.603571 0.030059 0.238079 -0.062371 0.598633 3.394588 -0.126682 0
2 -0.248582 0.132587 -0.100904 -0.089055 -0.197999 -0.333732 0.088580 -0.838552 1.041148 -0.262657 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 -2.770834 0.776536 0.503537 -3.603483 1 0.898871 1.585224 0.217755 0.400903 0.077622 0.645788 0.137490 0.172125 0.776581 0.565321 -0.191380 -0.128439 -0.335880 -0.319805 -1.932684 -2.062530 0.355032 1.132756 0.797684 -0.767774 0.899974 0.226721 0.329146 -0.016694 0.553849 -0.217292 -0.105722 0.983522 0.159267 -0.124400 -0.108598 0.083984 -1.156284 0.275774 -0.878505 1.016461 2.741807 0.065147 0.485437 -0.201068 -0.426977 3.842172 0.458053 -1.264178 -1.450164 -0.362502 -1.245545 1 1.351991 1.465379 0.194204 0.482346 0.525620 0.741754 0.171752 -0.947148 0.850047 1.066156 -0.129526 0.173402 -0.690070 0.614471 -0.955762 -1.005369 0.476061 -0.770848 -1.376946 0.254973 -0.674853 -0.786888 -1.048079 -0.062371 0.910428 3.394588 -0.126682 0
3 -0.105832 -11.554710 -0.100904 17.677702 3.607702 -1.360483 -3.235993 0.673606 1.041148 -0.249803 0.008961 0.444393 -0.196754 -0.385971 -0.140407 0.145382 -5.606576 -3.482346 -3.068231 -3.089020 0 -1.609955 -1.059766 0.217755 -0.072442 -2.338775 -1.645802 -3.510487 -1.455988 -1.451245 -2.019677 0.775719 4.677981 -0.132888 0.518749 -2.297617 0.446036 0.399059 -1.998463 -0.497970 0.755224 1.302547 0.152640 0.408241 -0.016694 -0.810967 -0.217292 -0.105722 0.248804 0.159267 -0.124400 -0.108598 -3.134801 -0.016005 0.231251 -0.419239 1.016461 -0.339494 0.065147 -2.126710 -0.201068 2.430384 -0.212658 0.458053 0.296347 0.157861 0.265609 0.342580 0 -1.344304 -1.187437 0.194204 0.200831 -1.969272 -1.600233 0.171752 -0.947148 -0.597040 -1.267947 3.322122 -2.724295 0.294037 -1.309324 -0.495506 0.429957 0.443906 0.698506 -0.199586 0.565256 1.644080 0.219150 1.389216 -0.062371 -1.520433 -0.296426 -0.126682 1
4 -0.340030 0.132587 -0.100904 -0.089055 0.089821 -0.033490 -0.053916 -0.515100 -0.961807 -0.258372 0.008961 0.444393 -0.196754 -0.385971 -0.140407 0.145382 0.231715 -0.196171 0.458324 0.496148 0 -0.606424 -0.252820 0.217755 -0.710430 -0.023061 -0.136936 0.137490 0.172125 -1.289221 0.170152 0.453777 0.015985 -0.100264 0.375270 -0.629352 -2.699076 -1.968822 0.555322 0.825359 -1.062521 -0.656780 0.444679 -1.703507 -0.016694 -0.236375 -0.217292 -0.105722 0.759148 -1.527793 2.154738 -0.108598 -0.489120 1.187398 -0.011526 -1.208647 -0.984942 -0.346288 0.065147 0.485437 -0.201068 -0.426977 -0.218298 0.458053 0.974836 1.129377 -0.467187 1.042432 0 -0.118716 0.497882 0.194204 0.482346 0.230172 -0.549105 0.171752 0.220953 -0.717631 -0.781100 -0.279474 -0.271087 -0.530890 -0.323035 0.655136 0.690759 0.144079 1.563471 0.503010 -0.610026 -0.239157 0.160390 0.490975 -0.062371 -0.736722 -0.296426 -0.126682 0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
297227 -1.853771 -0.046966 0.032971 0.010201 2.558985 -2.422389 -6.824064 -0.579400 -0.961807 -0.254088 0.008961 0.444393 -0.196754 -0.385971 -0.140407 0.145382 0.231715 -0.800828 0.820022 -0.854319 0 -1.427495 -1.059766 0.217755 0.400903 0.681721 -0.221810 0.137490 0.172125 -4.225901 -0.563545 0.550487 2.137625 -4.234803 -3.838225 -1.046418 0.401361 0.281966 -5.652549 0.481140 0.452266 -1.164403 0.517688 -1.879486 -0.016694 -0.377855 -0.217292 -0.105722 -1.093777 -8.569435 11.667665 -0.108598 -3.328317 -0.498092 1.955612 -0.612345 -0.984942 -0.342891 0.065147 0.485437 -0.201068 -0.426977 -0.212658 0.458053 0.974836 1.363881 0.893720 1.042432 0 -1.344304 -0.126310 -0.397548 0.148046 -2.067754 -1.581792 0.171752 0.220953 -1.220092 -0.014575 -0.176208 -2.627045 -0.652406 0.512028 -0.553038 -0.674106 -0.455577 1.294807 0.666404 -3.098122 -1.118615 0.057559 -1.604920 -0.062371 -1.024164 -0.296426 -0.126682 0
297228 -0.469272 0.132587 -0.100904 -0.089055 0.301702 -0.565699 -0.920337 0.118876 -0.961807 -0.262657 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 0.776536 -1.395378 0.496148 1 0.191839 0.823108 0.217755 0.400903 0.916649 0.645788 0.137490 0.172125 0.675316 -0.490746 -0.751279 0.450550 -1.062146 -0.887343 0.882512 0.763978 0.381254 0.492754 -1.226214 0.824493 -0.510030 0.239814 0.277296 -0.016694 0.578277 -0.217292 -0.105722 -0.099418 0.159267 -0.124400 -0.108598 0.619077 -0.557174 -2.272405 1.193751 -0.984942 -0.349686 0.065147 0.485437 -0.201068 -0.426977 -0.218298 0.458053 -0.300724 0.392365 -1.583829 1.042432 0 -1.001139 1.298928 -2.304308 0.482346 1.149342 0.741754 0.171752 0.220953 0.186799 -1.953488 -1.133898 0.820986 -0.758058 -0.891127 1.230456 0.576328 -1.380161 0.442004 0.638426 -1.439970 -1.056045 0.147257 -1.119598 -0.062371 0.632656 -0.296426 -0.126682 0
297229 0.972296 0.132587 -0.100904 -0.089055 -2.924514 0.255512 -0.699485 -0.125665 1.041148 -0.258372 0.008961 -2.326512 -0.196754 2.694919 -0.140407 0.145382 0.231715 0.776536 0.820022 0.496148 0 -1.609955 1.540393 0.217755 0.051039 -2.338775 -0.193519 0.137490 0.172125 -1.127197 -1.721791 0.056758 -2.757989 0.053145 -0.783401 -1.359218 0.043962 -2.242039 -0.687611 0.855956 -3.070256 -1.382633 0.322996 -2.759380 -0.016694 0.303142 -0.217292 -0.105722 0.613270 0.024791 0.057270 -0.108598 0.688691 0.849491 -0.338029 -0.270334 1.016461 -0.346288 0.065147 -2.126710 -0.201068 2.430384 -0.212658 0.458053 0.974836 0.705037 0.893720 0.001627 0 1.180408 1.122073 0.194204 0.376778 0.492792 0.741754 0.171752 0.220953 0.809850 0.910623 -0.858049 0.424897 -0.131917 0.163013 0.712668 1.203670 0.296622 -2.112631 0.601046 0.811148 1.441482 0.204460 0.967315 -0.062371 0.407726 -0.296426 -0.126682 1
297230 -1.191522 0.132587 -0.100904 -0.089055 -1.232935 -0.088134 1.184868 -0.039483 -0.961807 -0.258372 0.008961 0.444393 -0.196754 -0.385971 -0.148440 0.145382 0.231715 -1.221458 0.729598 -1.947554 0 -1.564340 -1.059766 0.217755 0.400903 -1.600431 0.636358 0.137490 0.172125 0.027221 -1.564769 0.326527 -1.141185 -0.481003 0.033153 -0.681486 0.562190 0.366533 -0.626483 0.473491 0.020937 -1.207100 0.055294 -0.359667 -0.016694 0.520683 -0.217292 -0.105722 -1.689200 0.098142 -0.124400 -0.009235 3.209776 0.089409 0.188688 -0.292707 -0.984942 -0.342891 0.065147 0.485437 -0.201068 -0.426977 -0.212658 0.458053 -0.029328 -0.054309 0.300504 -0.159877 0 -1.246257 -1.187437 -5.986325 -2.262434 -1.739479 -1.194535 0.171752 0.220953 -0.436253 -2.078260 3.496119 3.023152 1.592725 2.110534 -0.265377 -0.152502 0.365004 0.082545 0.355955 -1.410218 -0.658166 0.131010 -0.665850 -0.062371 -1.202690 -0.296426 -0.126682 0
297231 1.021223 0.132587 -0.100904 -0.089055 -3.243742 -1.251076 2.920830 -0.542772 1.041148 -0.258372 0.008961 -2.326512 -0.196754 2.694919 -0.140407 0.145382 0.231715 -1.379194 -4.153325 0.496148 0 -1.290650 -1.059766 0.217755 0.236261 -2.238092 0.296863 0.137490 0.172125 -4.225901 -1.379508 0.553032 -2.868530 -0.678170 0.661271 -1.150685 -2.091497 0.392554 0.467026 0.435245 0.236602 -0.789616 0.298659 -0.359667 -0.016694 -1.063582 -0.217292 -0.105722 1.616759 0.159267 -0.124400 -0.108598 -0.450157 -0.849058 0.694404 -0.885394 1.016461 -0.342891 0.065147 -2.126710 -0.201068 2.430384 -0.212658 0.458053 0.974836 0.749704 0.684350 1.042432 0 0.101890 -1.156227 0.194204 0.464752 1.083687 -0.890261 0.171752 0.220953 0.307389 1.193597 1.420887 -0.561030 -0.758598 0.445064 0.770200 -0.126422 0.401825 -0.199224 0.285151 0.752801 0.317986 0.233841 0.667901 -0.062371 -0.218342 -0.296426 -0.126682 1

297178 rows × 97 columns

In [38]:
# specify the target variables for featurewiz
target = 'label_0' 

# Apply featurewiz method for features selection
features, train = featurewiz(data, target, corr_limit=0.5, verbose=2, sep=",",
header=0,test_data="", feature_engg="", category_encoders="")
############################################################################################
############       F A S T   F E A T U R E  E N G G    A N D    S E L E C T I O N ! ########
# Be judicious with featurewiz. Don't use it to create too many un-interpretable features! #
############################################################################################
Skipping feature engineering since no feature_engg input...
Skipping category encoding since no category encoders specified in input...
**INFO: featurewiz can now read feather formatted files. Loading train data...
    Shape of your Data Set loaded: (297178, 97)
    Caution: We will try to reduce the memory usage of dataframe from 218.79 MB
        memory usage after optimization is: 56.40 MB
        decreased by 74.2%
     Loaded. Shape = (297178, 97)
No test data filename given...
Classifying features using a random sample of 10000 rows from dataset...
#### Single_Label Binary_Classification Feature Selection Started ####
    loading a random sample of 10000 rows into pandas for EDA
############## C L A S S I F Y I N G  V A R I A B L E S  ####################
Classifying variables in data set...
    96 Predictors classified...
        No variables removed since no ID or low-information variables found in data set
No GPU active on this device
    Tuning XGBoost using CPU hyper-parameters. This will take time...
Removing 0 columns from further processing since ID or low information variables
    columns removed: []
    After removing redundant variables from further processing, features left = 96
No interactions created for categorical vars since feature engg does not specify it
#### Single_Label Binary_Classification Feature Selection Started ####
Searching for highly correlated variables from 96 variables using SULOV method
#####  SULOV : Searching for Uncorrelated List Of Variables (takes time...) ############
There are no null values in dataset.
    Caution: We will try to reduce the memory usage of dataframe from 40.24 MB
        memory usage after optimization is: 40.24 MB
        decreased by 0.0%
    Removing (45) highly correlated variables:
Time taken for SULOV method = 355 seconds
    Adding 0 categorical variables to reduced numeric variables  of 51
Final list of selected vars after SULOV = 51
############## F E A T U R E   S E L E C T I O N  ####################
    using regular XGBoost
Train and Test loaded into Dask dataframes successfully after feature_engg completed
Current number of predictors = 51 
    XGBoost version: 1.5.2
Number of booster rounds = 20
        using 51 variables...
            Time taken for regular XGBoost feature selection = 11 seconds
        using 41 variables...
            Time taken for regular XGBoost feature selection = 21 seconds
        using 31 variables...
            Time taken for regular XGBoost feature selection = 30 seconds
        using 21 variables...
            Time taken for regular XGBoost feature selection = 36 seconds
        using 11 variables...
            Time taken for regular XGBoost feature selection = 40 seconds
        using 1 variables...
            Time taken for regular XGBoost feature selection = 43 seconds
            Total time taken for XGBoost feature selection = 45 seconds
No ID variables [] are selected since they are not considered important for modeling
Selected 25 important features:
['col_48', 'col_0', 'col_72', 'col_19', 'col_8', 'col_91', 'col_64', 'col_94', 'col_92', 'col_43', 'col_7', 'col_13', 'col_30', 'col_49', 'col_78', 'col_74', 'col_95', 'col_9', 'col_1', 'col_53', 'col_5', 'col_29', 'col_12', 'col_47', 'col_6']
    Time taken = 400 seconds
    Reverted column names to original names given in train dataset
Returning list of 25 important features and a dataframe.
In [39]:
selected_features_featurewiz = features
df_selected_features_featurewiz = pd.DataFrame(selected_features_featurewiz, columns=['Featurewiz Selected Features'])
df_selected_features_featurewiz
Out[39]:
Featurewiz Selected Features
0 col_48
1 col_0
2 col_72
3 col_19
4 col_8
5 col_91
6 col_64
7 col_94
8 col_92
9 col_43
10 col_7
11 col_13
12 col_30
13 col_49
14 col_78
15 col_74
16 col_95
17 col_9
18 col_1
19 col_53
20 col_5
21 col_29
22 col_12
23 col_47
24 col_6

Résultats :

Featurewiz has selected 25 variables. The next step to verify the job of featurewiz by analyzing the new correlation matrix.

Visulization of correlation heatmap after features selection

In [40]:
X_df_cleaned_features_selected_featurewiz = X_df_cleaned[selected_features_featurewiz]
correlations = X_df_cleaned_features_selected_featurewiz.corr()
fig_1 = plt.figure(figsize=(30, 20))
sns.heatmap(correlations, xticklabels=correlations.columns, yticklabels=correlations.columns, 
            cmap='YlGnBu', annot=True, linewidths=.5)
plt.title('Heatmap for features correlation after feature selection with featurewiz\n', fontsize=30)
plt.show()

We notice that all correlated variables with a correlation coefficient of +0.5 or -0.5 have been removed. In conclusion, featurewiz seems to have done its job well since we have obtained a list of features that are not very correlated anymore

PCA on selected features by featurewiz

In [41]:
nb = 50000 # analyze the first 50 000 observations
X_raw_df_cleaned_features_selected_featurewiz = X_raw_df_cleaned[selected_features_featurewiz]
X_pca = X_raw_df_cleaned_features_selected_featurewiz[:nb].to_numpy()
y_pca = y_df_cleaned[:nb].to_numpy()

mask = (y_pca==1).flatten()
pca = PCA(n_components=2)
pca_components = pca.fit_transform(X_pca)

label1 = pca_components[mask]
label2 = pca_components[~mask]

fig_1 = plt.figure(figsize=(20,16))

plt.scatter(label1[:,0], label1[:,1], c="lightsalmon", alpha=0.5, s=2, label="y = 1")
plt.scatter(label2[:,0], label2[:,1], c="lightskyblue", alpha=0.5, s=2, label="y = 0")

plt.xlim([-250, 300])
plt.xlabel("component 1", fontsize = 15)

plt.ylim([-250, 300])
plt.ylabel("component 2", fontsize = 15)

plt.title('PCA on the model features\n', fontsize = 30)
plt.grid(True)
plt.legend(fontsize = 15)
plt.show()

PCA results:

After the featurewiz processing, the PCA shows us that we get two poles with 3 quite distinct clusters. However, these seem to be spread out. Now it is relatively easier to find a separation of the data of each class.

2nd Features Selection method : ElasticNet

ElasticNet combine the two types of regularizations. It contains both $L1$ and $L2$ as penalty terms. I choose this features selection techniques because, it works better than Ridge and Lasso regression for most test cases.

In [42]:
%%time

def rmse_cv(model):
    rmse= np.sqrt(-cross_val_score(model, X_df_cleaned.iloc[:10000,:], y_df_cleaned.iloc[:10000,:], scoring="neg_mean_squared_error", cv = 5))
    return(rmse)

# Define parameters of elastic net
alphas = [0.0005, 0.001, 0.01, 0.03, 0.05, 0.1]
l1_ratios = [0.9, 0.8, 0.7, 0.5, 0.3, 0.2, 0.1]

# Run ElasticNet on several parameters
cv_elastic = [rmse_cv(ElasticNet(alpha = alpha, l1_ratio=l1_ratio)).mean() 
            for (alpha, l1_ratio) in product(alphas, l1_ratios)]

plt.rcParams['figure.figsize'] = (12.0, 6.0)
idx = list(product(alphas, l1_ratios))
p_cv_elastic = pd.Series(cv_elastic, index = idx)
p_cv_elastic.plot(title = "Validation - Just Do It")
plt.xlabel("alpha - l1_ratio")
plt.ylabel("rmse")
Wall time: 27 s
Out[42]:
Text(0, 0.5, 'rmse')
In [43]:
# Zoom in to the first 10 parameter pairs
plt.rcParams['figure.figsize'] = (12.0, 6.0)
idx = list(product(alphas, l1_ratios))[:10]
p_cv_elastic = pd.Series(cv_elastic[:10], index = idx)
p_cv_elastic.plot(title = "Validation - Just Do It")
plt.xlabel("alpha - l1_ratio")
plt.ylabel("rmse")
Out[43]:
Text(0, 0.5, 'rmse')

Visualize the rmse score in function of ElasticNet parameters

In [44]:
pd.DataFrame(p_cv_elastic, columns = ["rmse"])
Out[44]:
rmse
(0.0005, 0.9) 0.415588
(0.0005, 0.8) 0.415611
(0.0005, 0.7) 0.415636
(0.0005, 0.5) 0.415684
(0.0005, 0.3) 0.415634
(0.0005, 0.2) 0.415614
(0.0005, 0.1) 0.415608
(0.001, 0.9) 0.415530
(0.001, 0.8) 0.415538
(0.001, 0.7) 0.415549

Display best parameters which minimize rmse

In [45]:
pd.DataFrame(p_cv_elastic).idxmin()
Out[45]:
0    (0.001, 0.9)
dtype: object

Run ElasticNet with the best parameters

In [46]:
elastic = ElasticNet(alpha=0.001, l1_ratio=0.9)
elastic.fit(X_df_cleaned, y_df_cleaned)
Out[46]:
ElasticNet(alpha=0.001, l1_ratio=0.9)
In [47]:
coef = pd.Series(elastic.coef_, index = X_df_cleaned.columns)
print("Elastic Net picked " + str(sum(coef != 0)) + " variables and eliminated the other " +  str(sum(coef == 0)) + " variables")
Elastic Net picked 72 variables and eliminated the other 24 variables

Display the 10 best features and 10 worse features

In [48]:
imp_coef = pd.concat([coef.sort_values().head(10), coef.sort_values().tail(10)])
plt.rcParams['figure.figsize'] = (8.0, 10.0)
imp_coef.plot(kind = "barh")
plt.title("Coefficients in the Elastic Net Model")
Out[48]:
Text(0.5, 1.0, 'Coefficients in the Elastic Net Model')

Looking at this graph, we immediately notice that the features col_0 and col_48 seem to be the features having the most impact on the prediction of the target variable. In other words, these features seem to be the most revealing in our dataset and their degree of importance far exceeds that of all the other features. For the rest of the features, the difference in importance between each feature seems to be much less pronounced

In the following of this report we will analyze the first 25 features considered as the most important by the model.

Select the first 25 most important features by decreasing importance order

In [49]:
# We select the first 25 most important features
selected_features_elasticNet = pd.DataFrame(coef.sort_values(ascending=False), columns = ['selected_features'])
selected_features_elasticNet = selected_features_elasticNet[:25]
selected_features_elasticNet.dropna(subset = ['selected_features'], inplace=True)
selected_features_elasticNet = selected_features_elasticNet.index.values
pd.DataFrame(selected_features_elasticNet.tolist(), columns = ["ElasticNet Features Selected"])
Out[49]:
ElasticNet Features Selected
0 col_0
1 col_48
2 col_41
3 col_35
4 col_78
5 col_70
6 col_19
7 col_67
8 col_89
9 col_69
10 col_7
11 col_68
12 col_86
13 col_13
14 col_72
15 col_30
16 col_8
17 col_24
18 col_23
19 col_33
20 col_91
21 col_66
22 col_40
23 col_87
24 col_38

3rd Features Selection method : Random Forest Importance

Random forest importance is a variable selection technique that uses the Random Forest model to represent the importance of features in a dataset to predict the target variable. The feature importance (variable importance) describes which features are relevant. The tree-based strategies used by random forests naturally rank by how well they improve the purity of the node, or in other words a decrease in the impurity (Gini impurity) over all trees. Nodes with the greatest decrease in impurity happen at the start of the trees, while notes with the least decrease in impurity occur at the end of trees. Thus, by pruning trees below a particular node, we can create a subset of the most important features.

In [50]:
%%time

X_train, X_test, y_train, y_test = train_test_split(X_df_cleaned.iloc[:50000,:], y_df_cleaned.iloc[:50000,:], test_size=0.25, random_state=12)
rf = RandomForestClassifier(n_estimators=500, random_state=12)
rf.fit(X_train, y_train)
Wall time: 3min 2s
Out[50]:
RandomForestClassifier(n_estimators=500, random_state=12)

Display of the importance of each feature by decreasing importance order

In [51]:
importances = rf.feature_importances_

features_importances_df = pd.DataFrame({"Features": pd.DataFrame(X_df_cleaned).columns, "Importances": importances})
features_importances_df.set_index("Importances")
features_importances_df = features_importances_df.sort_values("Importances")


plt.figure(figsize=(100,100))
features_importances_df.plot.barh()
plt.xlabel("Features Importance", fontsize = 15)
plt.ylabel("Features", fontsize = 15)
plt.title("Random Forest Feature Importance", fontsize = 20)
Out[51]:
Text(0.5, 1.0, 'Random Forest Feature Importance')
<Figure size 7200x7200 with 0 Axes>

The same observation that we made for the result of the ElasticNet model can be shared here. Indeed, we also notice that the features col_0 and col_48 are the most important variables. Overall, the first 5 features selected by the Random Forest Features Importance model are identical to the first 5 features selected by the ElasticNet model. These two models confirm each other

Select the first 25 most important features

In [52]:
features_importances_df = features_importances_df.sort_values(ascending=False, by = "Importances")
selected_Features_RF = features_importances_df.iloc[:25,0].tolist()
pd.DataFrame(selected_Features_RF, columns = ["RF Selected Features"])
Out[52]:
RF Selected Features
0 col_0
1 col_48
2 col_78
3 col_41
4 col_7
5 col_91
6 col_55
7 col_79
8 col_89
9 col_43
10 col_31
11 col_30
12 col_35
13 col_69
14 col_40
15 col_65
16 col_38
17 col_86
18 col_64
19 col_93
20 col_5
21 col_72
22 col_39
23 col_83
24 col_45

Features Selection Summary

Let's analyze the 25 most important features selected by the 3 variable selection methods we implemented, namely :

  • Featurewiz
  • ElasticNet
  • Random Forest Features Importance

Grouping in a dataframe the list of the first 25 features selected by order of importance

In [53]:
df_featurewiz = pd.DataFrame(selected_features_featurewiz, columns = ["Featurewiz Features"])
df_elasticNet = pd.DataFrame(selected_features_elasticNet, columns = ["ElasticNet Features"])
df_rf = pd.DataFrame(selected_Features_RF, columns = ["RF Features"])
                             
df_summary_features_selected = pd.concat([df_featurewiz, df_elasticNet, df_rf], axis=1)
df_summary_features_selected
Out[53]:
Featurewiz Features ElasticNet Features RF Features
0 col_48 col_0 col_0
1 col_0 col_48 col_48
2 col_72 col_41 col_78
3 col_19 col_35 col_41
4 col_8 col_78 col_7
5 col_91 col_70 col_91
6 col_64 col_19 col_55
7 col_94 col_67 col_79
8 col_92 col_89 col_89
9 col_43 col_69 col_43
10 col_7 col_7 col_31
11 col_13 col_68 col_30
12 col_30 col_86 col_35
13 col_49 col_13 col_69
14 col_78 col_72 col_40
15 col_74 col_30 col_65
16 col_95 col_8 col_38
17 col_9 col_24 col_86
18 col_1 col_23 col_64
19 col_53 col_33 col_93
20 col_5 col_91 col_5
21 col_29 col_66 col_72
22 col_12 col_40 col_39
23 col_47 col_87 col_83
24 col_6 col_38 col_45

Results

We notice that the features col_0, col_48 occupy for the 3 methods the top of the ranking of the features importance. Then the features col_78 and col_41 are found just behind the features mentioned just before which also occupy the top of the ranking. Overall, the 3 models seem to select globally the same top 25 most important features.

Count occurences features selected by the 3 models

In [54]:
# Creation of a dataframe with the occurences of the features selected by the three models
features_list = list()
for i in range(0, len(selected_features_featurewiz)):
    features_list.append(selected_features_featurewiz[i])
    features_list.append(selected_features_elasticNet[i])
    features_list.append(selected_Features_RF[i])
    
df_features_list = pd.DataFrame(features_list, columns = ["Features Selected"])
df_features_occurences = pd.DataFrame(df_features_list.value_counts(), columns = ["Count"])
df_features_occurences = df_features_occurences.reset_index()
df_features_occurences
Out[54]:
Features Selected Count
0 col_0 3
1 col_91 3
2 col_72 3
3 col_78 3
4 col_48 3
5 col_30 3
6 col_7 3
7 col_40 2
8 col_13 2
9 col_89 2
10 col_35 2
11 col_38 2
12 col_69 2
13 col_41 2
14 col_43 2
15 col_86 2
16 col_64 2
17 col_19 2
18 col_8 2
19 col_5 2
20 col_70 1
21 col_94 1
22 col_74 1
23 col_92 1
24 col_79 1
25 col_93 1
26 col_87 1
27 col_9 1
28 col_83 1
29 col_6 1
30 col_68 1
31 col_67 1
32 col_12 1
33 col_23 1
34 col_24 1
35 col_29 1
36 col_31 1
37 col_33 1
38 col_39 1
39 col_45 1
40 col_47 1
41 col_49 1
42 col_53 1
43 col_55 1
44 col_1 1
45 col_65 1
46 col_66 1
47 col_95 1

Select the features that have been selected at least twice by a variable selection model

In [55]:
best_selected_features = df_features_occurences.loc[(df_features_occurences["Count"] >= 2)]
best_selected_features = best_selected_features["Features Selected"].tolist()
print("---- Best selected features ----\n")
print(best_selected_features)
---- Best selected features ----

['col_0', 'col_91', 'col_72', 'col_78', 'col_48', 'col_30', 'col_7', 'col_40', 'col_13', 'col_89', 'col_35', 'col_38', 'col_69', 'col_41', 'col_43', 'col_86', 'col_64', 'col_19', 'col_8', 'col_5']

-------- Machine Learning Approach --------

Machine Learning strategy

Through this data challenge, we have to propose a model that can solve a binary classification task. It is important to note that there is a wide range of algorithms that can provide a solution to a binary classification task. The idea here is not to try them all as this could prove to be a waste of time and not very profitable. On the other hand, it is advisable to select upstream the algorithms likely to answer best the classification problem to be solved. The strategy consists first of testing the classical approaches such as k-NN, SVM and Decision Tree. Then, it will be appropriate to apply the ensemblistic classification approaches because these methods should bring superior performances to the classical approaches. Finally, it will be necessary to test the performance of a neural network which could potentially bring even better performance than the set methods.

Below is a map of the algorithms that will be applied to try to solve the classification problem.

For each of the classification algorithms, it will be necessary to choose the best possible hyperparameters. Indeed, the choice of the hyperparameters is crucial to build the most efficient classification model possible. To do this with the python method GridSearchCV() allows by specifying upstream the list of values of each hyperparameter to test all the different combination of hyperparameter possible and thus retain the parameters of the best model that is to say those that minimizes the evaluation criterion namely the sum of false negatives and false positives. Once the values of these optimal hyperparameters are known, it is the time to train the model with these hyperparameters. In fact, the refinement of the model is the key to obtain the most efficient model for the binary classification task. However the search for the best hyperparameters can be very time consuming as the execution time of the GridSearchCV function depends very much on the possible combinations to be tested. Therefore, the search for hyperparameters will be done on a very small subsample of the training dataset. Since this sub-sample will be small, we will also do a cross evaluation to estimate the best parameters of the model. Indeed, This technique allows to avoid overfitting and to evaluate the performance of the model in a more robust way than a simple test run.

Finally, the model with the best performance on the test set will be trained on the whole dataset at our disposal and then perform the prediction on the X_test dataset.

Some comments on the choice of algorithms:

  • k-NN: Without a doubt, the k-NN algorithm is not the best algorithm to apply in our case of study because, we have a rather large dataset, yet k-NN does not scale very well because k-NN stores the entire dataset in memory to make a prediction. Therefore k-NN is not really suitable for our classification task. However, for pedagogical purposes, it can be good to implement a k-NN on a small subsample to obtain an order of magnitude of the performances with a relatively simple model.
  • Decision Tree: Even if models such as the Decision Tree are part of the weak models, an upstream implementation of a Decision Tree can be useful to determine the optimal hyperparameters and then use these hyperparameters in an Adaboost model based on a Decision Tree. The advantage of the Decision Tree model is that it performs well on large volume datasets.
  • SVM: Support Vector Machines were 10 years ago a state of the art model for classification tasks. For this reason, SVM is a serious candidate even if nowadays there are new, more advanced and more powerful techniques. However, the SVM algorithm suffers from the same scaling problem as the k-NN model. Therefore, like the k-NN algorithm, I will train the SVM model on a reduced subsample of the training dataset because otherwise the training time might be too much too long.
  • Assembly models such as Random Forests, Adaboost, XGBoost and Gradient Boosting are known to be particularly efficient models for classification tasks. Each of these approaches should be tested individually, taking care to refine the hyperparameters of each algorithm.
  • Voting Classifier that encompass several best classfier models for the binary classification task. Every machine learning classification model has its own advantages and disadvantages. By aggregating many models, we can overcome the disadvantages of each model to generalize the classification model. Overall voting classifier model improve performance. Voting classifier model are serious candidates whose stake will be to choose the complementary models which are able to work together to fill the weaknesses of the one and the other and thus to produce better performance
  • Neural Networks can be very powerful tools and can even outperform ensemble approaches on the classification task. However, they need to be trained on large datasets, and they require a very large number of calculations. Therefore, on the one hand, the dataset may not be large enough to obtain convincing results and on the other hand, depending on the configuration of the Neural Networks, the training time may be long. Finally, the last one to take into account, concerns the low computational performance of my personal computer, therefore, the Neural Networks architecture will be quite simple (limited stacking of hidden layers) but may be too simple to obtain better results than those obtained with the ensemblistic models. However, it would be a shame not to test the performance of a neural network on our classification task

Separating the data into train, validation and test sets

Before run the training of a machine learning model, it is necessary to separate the training set into 3 sub-samples.

  • Training sample {X_train, y_train} to train the model
  • Validation sample {X_valid, y_valid} : to discriminate among models trained on the train set (typically for the purpose of hyperparameter optmisation)
  • Test sample {X_test, y_test} to test the global performance of the model

For this data challenge, we have the training dataset as well as the training labels and the test set. To form the validation sample, a subsample of data will be created from the training dataset.

Since the original training set is quite large, running cross-validation methods on it will often not be possible on a standard laptop like mine. For this reason, my approach will be flexible. Depending on the model and the optimization procedures used to fine tune the hyperparameters, I will be able to work on smaller subsamples.

Model Evaluation

For the evaluation of the performance of the models, the idea is to minimize the sum of the rate of false positives rate FPR and the rate of false negatives rate FNR. The performance score of the model is calculated using the following equation.

$score = 1 - (FPR + FNR)$

This score metric represents the ability of the model to correctly predict the data.

When training a machine learning model, the amount of FPR and FNR should be minimized. In other words, the training must be able to maximize the score metric.

The predictions of the algorithm must be submitted to the Data Challenge educational site so that the performance score of the model is known.

Data Normalization

As shown on data exploration part, the data is very heterogenous and differently scaled. It may thus make sense to normalize it before using it for machine learning purposes. My preliminary attempts on normalisation produced however no improvement on the performances however. For this reason, I will typically not standardise the data, except on methodologies where this preprocessing is justified like k-NN and SVM because these methods use distance metrics in the train process. Indeed, for decision tree-based models, the normalization of the data will have no impact on the accuracy of the predictions generated by the algorithm.

Machine Learning Algorithms

Evaluation criterion function (utils for GridSearchCV)

In [56]:
def criterion_GridCV(y_true, y_pred):
    CM = confusion_matrix(y_true, y_pred)
    TN, TP = CM[0, 0], CM[1, 1]
    FP, FN = CM[0, 1], CM[1, 0]
    return 1 - (FP/(FP + TN) + FN/(FN + TP))

# specify the metric to maximize to GridSearchCV function
scoring_critetion = make_scorer(criterion_GridCV)

Model performance evaluation function

In [6]:
def criterion(y_pred, y_true):
    CM = confusion_matrix(y_true, y_pred)
    TN, TP = CM[0, 0], CM[1, 1]
    FP, FN = CM[0, 1], CM[1, 0]
    return FP/(FP + TN) + FN/(FN + TP)

Generate Train set

Generate Train dataset normalized

In [58]:
X_train = X_df_cleaned[best_selected_features]
y_train = y_df_cleaned
train = pd.concat([X_train, y_train], axis=1)
train
Out[58]:
col_0 col_91 col_72 col_78 col_48 col_30 col_7 col_40 col_13 col_89 col_35 col_38 col_69 col_41 col_43 col_86 col_64 col_19 col_8 col_5 label_0
0 1.302253 0.273220 -0.239038 0.141587 1.401711 0.915602 -0.767115 -0.821182 -0.385971 -0.920622 -0.837885 -1.176633 -1.344304 -0.965149 -0.615637 -0.566178 0.974836 -1.031166 1.041148 0.202542 1
1 1.142801 0.238079 0.464752 -1.220549 -0.917994 0.060955 1.536166 0.442407 -0.385971 -0.603571 1.091045 0.841658 -0.584439 0.339085 0.477096 -0.601426 -0.178596 0.496148 -0.961807 -1.322580 0
2 -0.248582 -1.048079 0.482346 1.066156 0.983522 0.565321 -0.838552 -0.767774 -0.385971 -0.674853 -1.932684 1.132756 1.351991 0.899974 0.329146 -0.770848 -1.264178 -3.603483 1.041148 -0.333732 0
3 -0.105832 1.389216 0.200831 -1.267947 0.248804 -2.019677 0.673606 0.755224 -0.385971 1.644080 -2.297617 -1.998463 -1.344304 1.302547 0.408241 0.698506 0.296347 -3.089020 1.041148 -1.360483 1
4 -0.340030 0.490975 0.482346 -0.781100 0.759148 0.170152 -0.515100 -1.062521 -0.385971 -0.239157 -0.629352 0.555322 -0.118716 -0.656780 -1.703507 1.563471 0.974836 0.496148 -0.961807 -0.033490 0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
297227 -1.853771 -1.604920 0.148046 -0.014575 -1.093777 -0.563545 -0.579400 0.452266 -0.385971 -1.118615 -1.046418 -5.652549 -1.344304 -1.164403 -1.879486 1.294807 0.974836 -0.854319 -0.961807 -2.422389 0
297228 -0.469272 -1.119598 0.482346 -1.953488 -0.099418 -0.490746 0.118876 0.824493 -0.385971 -1.056045 0.882512 0.492754 -1.001139 -0.510030 0.277296 0.442004 -0.300724 0.496148 -0.961807 -0.565699 0
297229 0.972296 0.967315 0.376778 0.910623 0.613270 -1.721791 -0.125665 -3.070256 2.694919 1.441482 -1.359218 -0.687611 1.180408 -1.382633 -2.759380 -2.112631 0.974836 0.496148 1.041148 0.255512 1
297230 -1.191522 -0.665850 -2.262434 -2.078260 -1.689200 -1.564769 -0.039483 0.020937 -0.385971 -0.658166 -0.681486 -0.626483 -1.246257 -1.207100 -0.359667 0.082545 -0.029328 -1.947554 -0.961807 -0.088134 0
297231 1.021223 0.667901 0.464752 1.193597 1.616759 -1.379508 -0.542772 0.236602 2.694919 0.317986 -1.150685 0.467026 0.101890 -0.789616 -0.359667 -0.199224 0.974836 0.496148 1.041148 -1.251076 1

297178 rows × 21 columns

Train dataset not normalized

In [59]:
X_train_raw = X_raw_df_cleaned[best_selected_features]
y_train = y_df_cleaned
train_raw = pd.concat([X_train_raw, y_train], axis=1)
train_raw
Out[59]:
col_0 col_91 col_72 col_78 col_48 col_30 col_7 col_40 col_13 col_89 col_35 col_38 col_69 col_41 col_43 col_86 col_64 col_19 col_8 col_5 label_0
0 236.031 0.829000 153.0 1.059 238.253 7.806 23.831 0.686000 0.0 0.335000 84.0 0.621000 110.0 0.375000 0.801000 0.643000 255.0 160.0 1.0 -3.195 1
1 228.040 0.826418 193.0 -7.620 107.160 2.570 58.040 0.932080 0.0 0.403857 121.0 0.918158 141.0 0.649915 0.869304 0.637621 170.0 255.0 0.0 -15.950 0
2 158.310 0.731915 194.0 6.950 214.620 5.660 22.770 0.696401 0.0 0.388376 63.0 0.961017 220.0 0.768143 0.860056 0.611766 90.0 0.0 1.0 -7.680 0
3 165.464 0.911000 178.0 -7.922 173.099 -10.177 45.229 0.993000 0.0 0.892000 56.0 0.500000 110.0 0.853000 0.865000 0.836000 205.0 32.0 1.0 -16.267 1
4 153.727 0.845000 194.0 -4.820 201.940 3.239 27.574 0.639000 0.0 0.483000 88.0 0.876000 160.0 0.440000 0.733000 0.968000 255.0 255.0 0.0 -5.169 0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
297227 77.865 0.691000 175.0 0.064 97.226 -1.256 26.619 0.934000 0.0 0.292000 80.0 -0.038000 110.0 0.333000 0.722000 0.927000 255.0 171.0 0.0 -25.148 0
297228 147.250 0.726660 194.0 -12.290 153.420 -0.810 36.990 1.006490 0.0 0.305589 117.0 0.866788 124.0 0.470933 0.856815 0.796856 161.0 255.0 0.0 -9.620 0
297229 219.495 0.880000 188.0 5.959 193.696 -8.352 33.358 0.248000 1.0 0.848000 74.0 0.693000 213.0 0.287000 0.667000 0.407000 255.0 255.0 1.0 -2.752 1
297230 111.054 0.760000 38.0 -13.085 63.577 -7.390 34.638 0.850000 0.0 0.392000 87.0 0.702000 114.0 0.324000 0.817000 0.742000 181.0 103.0 0.0 -5.626 0
297231 221.947 0.858000 193.0 7.762 250.406 -6.255 27.163 0.892000 1.0 0.604000 78.0 0.863000 169.0 0.412000 0.817000 0.699000 255.0 255.0 1.0 -15.352 1

297178 rows × 21 columns

ML Implementation steps

The implementation of the machine learnign algorithms will always follow the same steps :

  • 1/ Construction of a subsample of the training dataset which is randomly generated. The size of the sub-sample can vary in order to limit the training time of the algorithms during especially during the search especially during the search phase of the best hyperparameters by the python GridSearchCV function.
  • 2/ Separation of the subsample into a training subsample and a validation subsample. The proportions are as follows: 80% for the training subsample and 20% for the validation subsample.
  • 3/ Specification of the list of parameters that will be used by the GridSearchCV function. The purpose of the GridSearchCV function is to choose the best parameters of the algorithm that maximizes the score: $1-(FPR+FNR)$. To do this I will specify in the parameter scoring of GridSearchCV the name of the function that we defined previously thanks to the function make_scorer of the library sklearn to find the best parameters.
  • 4/ Fit the model with best parameters selected by the GridSearchCV function.
  • 5/ Compute the evaluation score $FPR+FNR$.
  • 6/ To further inspect the performance we display: the algorithm's performance evolution, the accuracy score, False Positive Rate, False Negative Rate and the confusion matrix.

The machine learning approach I adopted follows the workflow below

image.png

Basic Approaches

The objective of these different approaches is first to apply the models seen this study semester and then to obtain a preliminary estimate of the classification performance in this challenge.

1- Decision Tree Classifier

Algorithm Explanation

In decision computing, Decision Trees are widely used to solve classification problems.

They use a hierarchical representation of the data structure in the form of sequences of decisions (tests) to predict a class.

  • Each node corresponds to a class label and the attributes are represented on the inner node of the tree.
  • Any Boolean function on discrete attributes can be represented using the decision tree.

Each individual (or observation), which is to be assigned to a class, is described by a set of variables that are tested in the nodes of the tree. Tests are performed in the inner nodes and decisions are made in the leaf nodes.

Each leaf node represents the output variable $y$. The internal nodes and the root node are the input variables.

Its construction is based on a recursive partitioning of the individuals using the data. This partitioning is done by a succession of cut nodes. The cutting of a node, and what characterizes it, is done with stopping rules cut-off conditions.

To determine a plausible value of Y for an individual whose values {$X_{1},...,X{p}$} are known, we proceed step by step as follows. Starting from the root, at each node, we check if the cut-off condition is verified or not: if the condition is verified, we go to the branch associated with the answer "Yes" (answering the implicit question "Is the condition verified?"), otherwise, we go to the branch associated with the answer "No".

For the separation of a node, the algorithm uses metrics such as the Gini index (the most used metric) or entropy. For example, with the Gini index, by separating 1 node into 2 child nodes, we seek to obtain the greatest increase in purity. The Gini index measures impurity. The Gini criterion organizes the separation of the leaves of a tree by focusing on the most represented class in the dataset: it must be separated as quickly as possible.

Gini Index

$I=1-\sum_{i}^{n} f_{i}^{2}$

With :

  • $n$ : number of classes to predict
  • $f_{i}$ : frequency of the class $i$ in the noded

Stop criterions

  • Tree depth reaches a fixed limit
  • Number of leaves reaches a fixed maximum
  • The number of nodes is lower than a fixed threshold
  • The quality of the tree is sufficient
  • The quality of the tree does not increase significantly anymore

Once the tree is built, the number of leaves is sometimes too large. The model must be simplified by pruning the tree to the right depth. A good pruning corresponds to the right compromise between tree complexity and prediction accuracy.

  • A tree that is too deep = high complexity => high variance => possible of overfitting => weakened generalization power

  • A tree that is not deep enough = too low complexity => high bias => risk of underfitting

Pros and Cons

Pros

  • Simple to understand and interpret
  • Requires little data preparation
  • Very fast algorithm to execute
  • The nature of the input data does not matter. Handles both continuous and categorical data
  • Good behavior at extreme values (outliers)
  • It is efficient for large data sets

Cons

  • Non-optimal trees because of heuristic rules
  • Tree sometimes unbalanced
  • Instability: If you change a variable in the tree, the whole tree changes. Very deep trees produce high variance estimators
  • Requires a large number of individuals for the rules to have a value

Parameter choices strategy

The most important parameter for a Decesion Tree is the max_depthparameter. It represents the depth of each tree, which is the maximum number of different features used in each tree. A good practice for the choice of the parmeter is to start with a shallow depth of 2 for example and increment this value by 1 without exceeding 7. To sum up, I optimize the tree depth and the node separation criterion via a GridSearchCV.

Build sample of train data

In [57]:
# build sample of train data
temp_data = train_raw.sample(n=int(round(X_train.shape[0] * 0.5,0)), random_state=230)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]

Split data

In [58]:
# split into X_train, y_train, X_valid and y_valid
dec_tree_X_train, dec_tree_X_valid, dec_tree_y_train, dec_tree_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2)

Define hyperparameters and fit the model

In [59]:
%%time

# define the evaluation procedure
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)

# define grid
param_grid = {
              'ccp_alpha': [0.1, .01, .001], # learning rate
              'max_depth' : [1, 2, 3, 4, 5, 6],
              'criterion' :['gini', 'entropy']
             }

# run grid search
tree_clas = DecisionTreeClassifier(random_state=1024)
grid_search = GridSearchCV(estimator=tree_clas, param_grid=param_grid, cv=cv, scoring=scoring_critetion, n_jobs = -1, verbose=True)
grid_search.fit(dec_tree_X_train, dec_tree_y_train)

# print best model
print(grid_search.best_estimator_)
Fitting 30 folds for each of 36 candidates, totalling 1080 fits
DecisionTreeClassifier(ccp_alpha=0.001, criterion='entropy', max_depth=6,
                       random_state=1024)
Wall time: 5min 23s

Compute FPR + FNR score

In [60]:
# compute FPR + FNR score
dec_tree_y_pred = grid_search.predict(dec_tree_X_valid)
valid_score = criterion(dec_tree_y_pred, dec_tree_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.5333265814854321

To further inspect the performance

In [61]:
# to further inspect the performance:
CM = confusion_matrix(dec_tree_y_valid, dec_tree_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
dec_y_prob = grid_search.predict_proba(dec_tree_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(dec_tree_y_valid, dec_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[11311  3652]
 [ 4268 10487]]
Accuracy: 0.7334948516050879
False Positive Rate: 0.2440687028002406
False Negative Rate: 0.2892578786851915
FPR + FNR = 0.5333265814854321

Result discussion

The $FPR$ and $FNR$ rate is quite high at 0.53 enven if the accruracy is not that bad. Howerver, the algorithm does worse than the random. We also notice with the confusion matrix that the false negatives are in majority compared to the false positives. Thus, not surprisingly, the performance obtained on the validation set is quite bad. A more adequate approach is needed. It will be interesting to compare this score with the score obtained by a Random Forest.

2/ k-NN Classifier

Algorithm Explanation

To predict the class of a new data point, we compute the distance between all the other data points and among the $K$ points with the smallest distance we look at the majority class (most represented class). This is called a majority vote.

k-NN does not compute any predictive model and it fits in the framework of Lazy Learning because it manipulates already classified individuals for any new classification.

The loss function

Minimization of the distance ($r_{1}(x)$ : the nearest neighbor index)

$r_{1}(\mathbf{x})=i^{*} \quad$ if and only if $\quad d_{i^{*}}(\mathbf{x})=\min _{1 \leq i \leq n} d_{i}(\mathbf{x})$

The decision to classify point x is made by a Majority Vote:

$\hat{f}_{k}(\mathbf{x}) \in \underset{y \in \mathcal{Y}}{\arg \max }\left(\sum_{j=1}^{k} \mathbb{1}_{\left\{y_{r_{j}}=y\right\}}\right)$

Pros and Cons

Pros

  • Easy to understand
  • Easy to implement
  • Provides good performance without too much adjustment on the parameters
  • Polyvalent: Used for regression and classification

Cons

  • The larger the dataset, the slower the prediction
  • Requires a lot of storage and computing power because k-NN stores the whole dataset in memory to perform a prediction
  • Does not work when the dataset contains too many features
  • Bad on sparse datasets (null features)

Parameters choices strategy

To calculate the distance between an unclassified point and other classified data points there are several metrics such as :

  • Distance euclienne : $\sum_{i=1}^{n}\left|x_{i}-y_{i}\right|$
  • Distance Manhattan : $\sqrt{\sum_{i=1}^{n}\left(x_{i}-y_{i}\right)^{2}}$

Without any doubt, the performance of the algorithm depends strongly on the choice of the K parameter. However, it is important to find the right compromise between :

  • $k$ too small: high sensitivity to noisy points (high variance => risk of overlearning)
  • $k$ too large : the neighborhood can include points of other classes. (strong bias)

A good practice is to start the training with a $k$ number of weak neighbors and then increase this value at each iteration of the algorithm.

To sum up I optimise the number of neighbours k with a grid search. I run the grid on a small subset of the train set because k-NN becomes quickly intractable at large scale. Also, as k-NN is sensitive to the scale of the variables, I standardize the features, making it possible to use a regular Euclidian distance for the algorithm

Build sample of train data

In [62]:
# build sample of train data
temp_data = train.sample(n=int(round(X_train.shape[0] * 0.02,0)), random_state=230)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]

Split data

In [63]:
# split into X_train, y_train, X_valid and y_valid
knn_X_train, knn_X_valid, knn_y_train, knn_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2)

Define hyperparameters and fit the model

In [64]:
%%time

# define the evaluation procedure
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)

max_k = 60 # set the nb neighbours at 60

# define grid
grid = {
    'n_neighbors':list(range(1, max_k)),
       }

# run grid search
knn = KNeighborsClassifier()
grid_search = GridSearchCV(knn, grid, cv=cv, scoring=scoring_critetion, n_jobs = -1, return_train_score = True);
grid_search.fit(knn_X_train,knn_y_train);

# print best model
print(grid_search.best_estimator_)
KNeighborsClassifier(n_neighbors=46)
Wall time: 12min 49s

Inspect the accuracy values given the value of $k$ neighbours

In [65]:
# plot the accuracy values given the value of k
fig_1 = plt.figure(figsize=(8,6))
plt.plot(list(range(1, max_k)), grid_search.cv_results_['mean_test_score'])
plt.xlim([0, max_k])
plt.xlabel("k")
plt.ylabel("1 - (FPR + FNR)")
plt.title('Scoring Critrerion on validation set for different values of k')
plt.grid(True)
plt.show()
# print outcome
best_k = grid_search.best_estimator_.n_neighbors

print("The optimal value for k is " + str(best_k) +\
      ", which corresponds to an Scoring Criterion of " + str(grid_search.best_score_) + " on the validation set.")
The optimal value for k is 46, which corresponds to an Scoring Criterion of 0.49879797215760974 on the validation set.

Compute FPR + FNR score

In [66]:
# compute FPR + FNR score
knn_y_pred = grid_search.predict(knn_X_valid)
valid_score = criterion(knn_y_pred, knn_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.515042220294096
In [67]:
# to further inspect the performance:
CM = confusion_matrix(knn_y_valid, knn_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
knn_y_prob = grid_search.predict_proba(knn_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(knn_y_valid, knn_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[464 158]
 [148 419]]
Accuracy: 0.7426408746846089
False Positive Rate: 0.2540192926045016
False Negative Rate: 0.26102292768959434
FPR + FNR = 0.515042220294096

Result discussion

The $FPR$ and $FNR$ rate is quite high: 0.51. The algorithm does a little bit worse than the random. We also notice with the confusion matrix that the number of false positives and false negatives is almost similar. Thus, not surprisingly, the performance obtained on the validation set is quite bad. However, this is mainly due to the fact that the algorithm is train on a tiny data set, and also partly due to the fact that the k-NN algorithm does not produce accurate scores, but only approximate probabilities based on the $k$ nearest neighbors. A more adequate approach is needed.

3/ SVM

Algorithm Explanation

The Support Vector Machine SVM can be used for both classification and regression challenges. Solving a classification problem, the SVM attempts to find the hyperplane that differentiates the two classes very well. The concept of a frontier implies that the data are linearly separable. To achieve this, SVM use kernels, i.e. mathematical functions to project and separate the data in the vector space. The separation boundary is chosen as the one that maximizes the margin. Maximizing the distance between the closest data point (of either class) and the hyperplane will help us to choose the right hyperplane. This distance is called the margin. The margin allows us to be tolerant of small variations.

The margin is the distance between the hyperplane and the closest samples. The points located on the margins are called the support vectors.

In the case where the data are not linearly separable, the SVM transforms the representation space of the input data into a higher dimensional space in which a linear separation is likely to exist. This is achieved by kernel functions.

To separate the data, SVM consider a triplet of hyperplanes:

  • $H: {w}^{T}{x}+b=0, H_{1}: {w}^{T} {x}+b=1, H_{-1}: {w}^{T}{x}+b=-1$

We call geometric margin, $\rho({w})$ the smallest distance between the data and the hyperplane $H$, here therefore half the distance between $H_{1}$ and $H_{-1}$ A simple calculation gives: $ \rho({w})= \frac{1}{|{w}|}$.

image.png

The goal is pretty simply :

  • Maximize margin $\rho(\mathbf{w})$ while separating the data on both sides of $H_{1}$ and $H_{-1}$
  • Separate blue data $\left(y_{i}=1\right): \mathbf{w}^{T} x_{i}+b \geq 1$
  • Separate red data $\left(y_{i}=-1\right): \mathbf{w}^{\top} x_{i}+b \leq-1$

Optimization in the primal space

$\underset{w, b}{\operatorname{min}} \quad \frac{1}{2}\|w\|^{2}$ under the constraint $1-y_{i}\left(w^{\top} x_{i}+b\right) \leq 0, i=1, \ldots , n$.

Pros and Cons

Pros

  • Adapt to multi-class classification problems
  • Simple and fast method to implement
  • Few parameters to define except the right kernel
  • Works well for datasets with many features

Cons

  • Time and memory consuming algorithm
  • Difficult to scale up with SVMs
  • Choice of kernel can sometimes be complicated
  • Very sensitive to outliers

Parameter choices strategy

To be efficient, SVM require some optimisation. I proceed by grid search, similarly to the k-NN models. The parameters considered are the regularisation value C, the kernel (linear, gaussian, polynomial or sigmoid), and the hyperparameter values gamma and degree associated to the kernels.

Because the search of best parameter with the grid search python function take a lot of time and the cost of fitting a SVM increases at least quadratically, I use only 1% of the train dataset for the grid search and the fit.

Lastly, as for the k-NN, SVM is sensitive to the scale of the variables, I standardize the features, making it possible to use a regular Euclidian distance for the algorithm

Build sample of train data

In [57]:
# build sample of train data
temp_data = train.sample(n=int(round(X_train.shape[0] * 0.01,0)), random_state=340)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]

Split data

In [58]:
# split into X_train, y_train, X_valid and y_valid
svm_X_train, svm_X_valid, svm_y_train, svm_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2)

Define hyperparameters and fit the model

In [59]:
%%time

# define the evaluation procedure
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)

# define grid
grid = {
    'C':[1, 1.5, 2, 2.5, 3], 
    'kernel':('linear', 'poly', 'rbf', 'sigmoid'),
    'gamma':['scale', "auto", 0.01, 0.1], 
    'degree': [2, 3, 4],
       }

# run grid search
svm = SVC();
grid_search = GridSearchCV(svm, grid, cv=cv, scoring=scoring_critetion, n_jobs = -1, verbose = True);
grid_search.fit(svm_X_train,svm_y_train);

# print best model
print(grid_search.best_estimator_)
Fitting 30 folds for each of 240 candidates, totalling 7200 fits
SVC(C=1.5, degree=2)
Wall time: 12min 34s

Compute FPR + FNR score

In [60]:
# compute FPR + FNR score
svm_y_pred = grid_search.predict(svm_X_valid)
valid_score = criterion(svm_y_pred, svm_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.4673355478724606

To further inspect the performance

In [63]:
# to further inspect the performance:
CM = confusion_matrix(svm_y_valid, svm_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
Confusion Matrix: 
 [[238  60]
 [ 79 218]]
Accuracy: 0.7663865546218488
False Positive Rate: 0.20134228187919462
False Negative Rate: 0.265993265993266
FPR + FNR = 0.4673355478724606

Result discussion

First of all, before discussing the obtained result, we can see that the search for the best hyperparameters takes a lot of time (approx. 12min) and even if the model is trained only on 1% of the training dataset.

The $FPR$ and $FNR$ rate remains high : 0.46 but this is the best score I could get for this moment. Indeed $FPR$ and $FNR$ rate is slightly less than the rate obtained by k-NN. As Decision Tree we also notice with the confusion matrix that the false positives are in majority compared to the false negatives. The algorithm does does almost as much as random. Thus, not surprisingly, the performance obtained on the validation set is not that bad even if the algorithm is train on a tiny data set.

Bagging Approaches

Bagging is a technique that consists in assembling a large number of algorithms with low individual performance (shallow Decision Trees) to create a much more efficient one (Random Forest). The low performance algorithms are called "weak learners" and the result "strong learner". Weak learners can be of different kinds and have different performances, but they must be independent of each other. The assembly of "weak learners" (shrubs) into "strong learner" (forest) is done by voting. That is to say that each "weak learner" will emit an answer (a vote), and the prediction of the "strong learner" will be the average of all the emitted answers. In fact, bagging combines the "best" classifiers in a way that reduces their variance. Bagging is used with decision trees, where it greatly increases the stability of the models by improving accuracy and reducing variance, thus eliminating the challenge of overfitting.

4/ Random Forest Classifier

Algorithm Explanation

First of all, Random Forests represent a class of machine learning algorithms with solid performances in the family of ensemble learning. Random forests are therefore an improvement of the bagging for Decision Tree in order to make the trees used more independent (less correlated). The random forest is composed of several decision trees, working independently on a classification task. Each one produces an estimate, and it is the assembly of the decision trees and their analyses that will give a global estimate. In other words, it is a matter of drawing inspiration from different opinions, dealing with the same problem, to better understand it. Each model is randomly distributed to subsets of decision trees. The term random forest comes from the fact that the individual predictors are, here, explicitly predictors per tree, and from the fact that each tree depends on an additional random variable (i.e. in addition to $L_{n}$). A random forest is the aggregation of a collection of random trees.

image.png

The decision to classify point x is made by a Majority Vote:

$\hat{f}_{k}(\mathbf{x}) \in \underset{y \in \mathcal{Y}}{\arg \max }\left(\sum_{j=1}^{k} \mathbb{1}_{\left\{y_{r_{j}}=y\right\}}\right)$

Pros and Cons

Pros

Same as Decision Tree

Cons

Same as Decision Tree

Parameter choices strategy

I optimize my model with a grid search on the following parameters: the split criterion (Gini or entropy), the max depth of the trees (from 2 to 7). I voluntarily choose to focus on the search for parameters with a limited tree depth to avoid overfitting. In addition I also optimize the number of estimator parameters this corresponds to the number of Decision Tree used to make the prediction. Overall I optimize the parmeters by performing a grid search with the same parameters as those defined for the implementation of the Decision Tree algorithm.

In [78]:
# build sample of train data
temp_data = train_raw.sample(n=int(round(X_train.shape[0] * 0.05,0)), random_state=140)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]
In [79]:
# split into X_train, y_train, X_valid and y_valid
rf_X_train, rf_X_valid, rf_y_train, rf_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2)
In [80]:
%%time

# define the evaluation procedure
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)

# define grid
grid = {'criterion':('gini', 'entropy'),
        'n_estimators': [100, 110, 120],
        'max_depth':[2, 3, 4, 5, 6, 7],
       }


# run grid search
rf = RandomForestClassifier();
grid_search = GridSearchCV(rf, grid, cv=cv, scoring=scoring_critetion, n_jobs = -1);
grid_search.fit(rf_X_train,rf_y_train);


# print best model
print(grid_search.best_estimator_)
RandomForestClassifier(max_depth=7, n_estimators=120)
Wall time: 10min 28s
In [81]:
# compute FPR + FNR score
rf_y_pred = grid_search.predict(rf_X_valid)
valid_score = criterion(rf_y_pred, rf_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.4946489384827186
In [82]:
# to further inspect the performance:
CM = confusion_matrix(rf_y_valid, rf_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
rf_y_prob = grid_search.predict_proba(rf_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(rf_y_valid, rf_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[1129  363]
 [ 372 1108]]
Accuracy: 0.7526917900403769
False Positive Rate: 0.2432975871313673
False Negative Rate: 0.25135135135135134
FPR + FNR = 0.4946489384827186

Result discussion

The $FPR$ and $FNR$ rate is quite high: 0.49. The algorithm does as well than the random. But Random Forest is quite better on the classification task than Decision Tree (0.53). We also notice with the confusion matrix that the number of false positives and false negatives is almost similar. Thus, not surprisingly, the performance obtained on the validation set is quite bad.

5/ ExtraTrees Classifier

ExtraTrees and Random Forests are close methodologies. Indeed, the both algorithm have much in common. Both are composed of a large number of decision trees, where the final decision is obtained by considering the prediction of each tree and the class prediction is decided by a majority vote

However there are two differences which are as follows:

  • Random Forest uses bootstrap replicas , i.e., it subsamples the input data with replacement, while Extra Trees uses the entire original sample.
  • Another difference is the selection of cutpoints to split the nodes. Random Forest chooses the optimal split while Extra Trees chooses it randomly. However, once the split points are selected, both algorithms choose the best one among all the subsets of features. Therefore, ExtraTrees adds randomization but still has optimization.

Parameter choices strategy

I optimize my model with a grid search on the following parameters: the split criterion (Gini or entropy), the max depth of the trees (from 1 to 7). I voluntarily choose to focus on the search for parameters with a limited tree depth to avoid overfitting. Since Extra Tree and Random Forests adopt very similar approaches, I used the same optimization strategy to refine ExtraTrees learning so, my approach is thus similar to random forests.

In [83]:
# build sample of train data
temp_data = train_raw.sample(n=int(round(X_train.shape[0] * 0.1,0)), random_state=140)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]
In [84]:
# split into X_train, y_train, X_valid and y_valid
etr_X_train, etr_X_valid, etr_y_train, etr_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2)
In [85]:
%%time

# define the evaluation procedure
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)

# define grid
grid = {'criterion':('gini', 'entropy'),
        'n_estimators': [100, 110, 120],
        'max_depth':[2, 3, 4, 5, 6, 7],
       }


# run grid search
etr = ExtraTreesClassifier();
grid_search = GridSearchCV(etr, grid, cv=cv, scoring=scoring_critetion, n_jobs = -1);
grid_search.fit(etr_X_train,etr_y_train);

# print best model
print(grid_search.best_estimator_)
ExtraTreesClassifier(max_depth=7, n_estimators=120)
Wall time: 6min 5s
In [86]:
etr_y_pred = grid_search.predict(etr_X_valid)
valid_score = criterion(etr_y_pred, etr_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.5554041465536488
In [87]:
# to further inspect the performance:
CM = confusion_matrix(etr_y_valid, etr_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
etr_y_prob = grid_search.predict_proba(etr_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(etr_y_valid, etr_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[2278  680]
 [ 972 2014]]
Accuracy: 0.7220726783310901
False Positive Rate: 0.22988505747126436
False Negative Rate: 0.32551908908238447
FPR + FNR = 0.5554041465536488

Result discussion

The $FPR$ and $FNR$ rate is quite high: 0.55. The algorithm does worse than the random. But Random Forest(0.49) is better on the classification task than ExtraTree. we also notice with the confusion matrix that the false positives are in great majority compared to the false negatives. Thus, not surprisingly, the performance obtained on the validation set is quite bad.

Boosting Approaches

The principle of boosting is to combine the outputs of several weak classifiers to obtain a much more accurate prediction (strong classifier).

The boosting method is used to decrease the bias. Each weak classifier is weighted by the quality of its classification: the better it classifies, the more important it will be. The poorly classified examples will have a greater weight (we say they are boosted) towards the weak learner in the next round, so that it makes up for the lack.

In fact, after analyzing the first tree, they increase the weight of each observation that the model fails to classify correctly. On the other hand, they decrease the weight of those whose classification is not a problem. The original idea is to improve the predictions of the first tree. In fact the goal is to correct the shortcomings of the previous tree.

6/ Adaboost

Algorithm Explanation

The "weak learners" of AdaBoost are generally decision trees with only 2 branches and 2 leaves (also called stumps) but we can use other types of classifiers. Here are the steps to build the first "weak learner" that we will call $w_{1}$ :

  • We assign the same weight to each row of the dataset
  • We train the first weak learner in order to maximize the number of correct answers
  • We give a score to w1 according to its performance

The score will allow us to determine the weight to be given to which "weak learner" at the time of the final vote.

Moreover, we want the next weak learner to be able to correct the mistakes of the previous one. To do this, we will increase the weight of the lines on which the first weak learner was wrong, and decrease those on which the first weak learner was right. Here are the steps to build the other "weak learner":

  • We modify the weights assigned to the lines according to the errors of the last weak learner
  • We train a "weak learner" to maximize the number of correct answers on the lines with high weights
  • We give a score to this "weak learner" according to its performance

Contrary to $w_{1}$, the next "weak learner" will take into account the weights assigned to the lines. The higher the weight of a line, the more important it is for the weak learner to classify this line correctly, and inversely.

Parameter choices strategy

To apply the algorithm, I chose to use a Decision Tree classifier. I optimized the parameters of the algorithm by varying the number of estimators (i.e. the number of weak learners such as Decision Tree) as well as the learning rate. In addition I decided to choose the best hyperparameters selected by the grid serach of my Decision Tree algorithm. But by iterating the model several times I could see that a smaller tree depth generally led to better performance so I set the depth of the decision tree to 2 (max_depth = 2)

Build sample of train data

In [128]:
# build sample of train data
temp_data = train_raw.sample(n=int(round(X_train.shape[0] * 0.05,0)), random_state=140)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]

Split data

In [129]:
# split into X_train, y_train, X_valid and y_valid
adb_tree_X_train, adb_tree_X_valid, adb_tree_y_train, adb_tree_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2)

Define hyperparameters and fit the model

In [130]:
%%time

# define the evaluation procedure
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)

# define grid
grid = {'learning_rate': [0.01, 0.1, 1.0],
        'n_estimators': [100, 110, 120],
       }

# run grid search
adb_tree = AdaBoostClassifier(DecisionTreeClassifier(ccp_alpha=0.001, criterion='entropy', max_depth=2), random_state=0)
grid_search = GridSearchCV(adb_tree, grid, cv=cv, scoring=scoring_critetion, n_jobs = -1);

# fit on train set
grid_search.fit(adb_tree_X_train, adb_tree_y_train)

# print best model
print(grid_search.best_estimator_)
AdaBoostClassifier(base_estimator=DecisionTreeClassifier(ccp_alpha=0.001,
                                                         criterion='entropy',
                                                         max_depth=2),
                   learning_rate=0.1, n_estimators=100, random_state=0)
Wall time: 9min 30s

Compute FPR + FNR score

In [131]:
adb_tree_y_pred = grid_search.predict(adb_tree_X_valid)
valid_score = criterion(adb_tree_y_pred, adb_tree_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.5198994971233135

To further inspect the performance

In [132]:
# to further inspect the performance:
CM = confusion_matrix(adb_tree_y_valid, adb_tree_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
adb_tree_y_prob = grid_search.predict_proba(adb_tree_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(adb_tree_y_valid, adb_tree_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[1124  352]
 [ 421 1075]]
Accuracy: 0.7399057873485868
False Positive Rate: 0.23848238482384823
False Negative Rate: 0.28141711229946526
FPR + FNR = 0.5198994971233135

Result discussion

With the second approach I managed to get the rate of $FPR$ and $FNR$ down to 0.51. So, not surprisingly, the performances obtained on the validation set are much better because this time I used all the dataset to train my model. Also increasing the number of estimators seems to improve the performances. Surprisingly the boosting algorithms don't provide much better performance than the bagging algorithms and the classical approaches such as SVM, k-NN. But it is still more efficient thanDecision Tree

7/ Gradient Boosting

Algorithm Explanation

The Gradent Boosting algorithm has a lot in common with Adaboost. Like Adaboost, it is a set of weak learners, created one after the other, forming a strong learner. Moreover, each weak learner is trained to correct the mistakes of the previous weak learners. However, unlike Adaboost, all weak learners have equal weight in the voting system, regardless of their performance. The first weak learner ($w_{1}$) is very basic, it is simply the average of the observations. It is therefore not very efficient, but it will serve as a basis for the rest of the algorithm. Afterwards, we compute the difference between this average and the reality, which we call the first residual. In general, we will call the difference between the prediction of the algorithm and the reality, i.e. the expected value. The particularity of Gradient Boosting is that it tries to predict at each step not the data itself but the residues. Thus, the second "weak learner" is trained to predict the first residual. The predictions of the second weak learner are then multiplied by a factor less than 1.

The idea behind this multiplication is that several small steps are more accurate than a few large steps. The multiplication therefore reduces the size of the "steps" to increase the accuracy. The objective is to "move" the predictions of the model away from the mean, little by little, to bring them closer to reality. From this moment, the creation of the weak learners always follows the same pattern:

  • From the last predictions, we calculate the new residuals (difference between reality and the prediction)
  • Train the new "weak learner" to predict these residues
  • Multiply the predictions of this "weak learner" by a factor less than 1
  • Obtain new predictions, often slightly better than the previous ones

Parameter choices strategy

I follow a usual grid search approach for the main parameters of interest: the learning rate, the maximum depth, and the maximum number of features retained.

Build sample of train data

In [143]:
# build sample of train data
temp_data = train_raw.sample(n=int(round(X_train.shape[0] * 0.05,0)), random_state=140)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]

Split data

In [144]:
# split into X_train, y_train, X_valid and y_valid
gb_X_train, gb_X_valid, gb_y_train, gb_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2)

Define hyperparameters and fit the model

In [145]:
%%time

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)

# define grid
grid = {'learning_rate':[0.2, 0.3],
        'max_depth':[1, 2, 3, 4, 6],
        'max_features':['sqrt', "log2", 0.2]}

# run grid search
gb = GradientBoostingClassifier()
grid_search = GridSearchCV(gb, grid, cv=cv, scoring=criterion_GridCV, n_jobs = -1);
grid_search.fit(gb_X_train, gb_y_train);

print(grid_search.best_estimator_)
GradientBoostingClassifier(learning_rate=0.2, max_depth=1, max_features='sqrt')
Wall time: 6min 32s

Compute FPR + FNR score

In [146]:
gb_y_pred = grid_search.predict(gb_X_valid)
valid_score = criterion(gb_y_pred, gb_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.5420536789949002

To further inspect the performance

In [147]:
# to further inspect the performance:
CM = confusion_matrix(gb_y_valid, gb_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
gb_y_prob = grid_search.predict_proba(gb_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(gb_y_valid, gb_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[1049  421]
 [ 384 1118]]
Accuracy: 0.7291386271870794
False Positive Rate: 0.2863945578231292
False Negative Rate: 0.255659121171771
FPR + FNR = 0.5420536789949002

Result discussion

The $FPR$ and $FNR$ rate is quite high: 0.54. The algorithm does worse than the random.

8/ XGBoost

Algorithm Explanation

XGBoost is an improved version of the Gradient Boosting algorithm. Indeed, it relies on a set of "weak learners" who predict the residuals, and correct the errors of the previous "weak learners".

The main difference between XGBoost and other implementations of the Gradient Boosting method is that XGBoost is computationally optimized to make the various computations required to apply Gradient Boosting faster. Specifically, XGBoost processes data in multiple compressed blocks allowing for much faster sorting and parallel processing.

But the advantages of XGBoost are not only linked to the implementation of the algorithm, and thus to its performance, but also to the various parameters it offers. Indeed XGBoost offers a panel of very important hyperparameters; it is thus possible thanks to this diversity of parameters, to have a total control on the implementation of Gradient Boosting It is also possible to add different regularizations to the loss function, limiting a phenomenon that happens quite often when using gradient boosting algorithms: overfitting.

Parameter choices strategy

For the choice of the parameters of this algorithm I took the decision to find the best parameters of the algorithm manually in order to be able to keep the hand on the refinement of the model which is not the case with a grid search.

The strategy I have implemented is as follows:

1/ Firstly I intialize the hyperparameters of the model by filling reasonable values for key inputs:

  • learning_rate: 0.01
  • n_estimators: 100 because I train the model on the entire train dataset
  • max_depth: 3
  • subsample: 0.8
  • colsample_bytree: 1
  • gamma: 1
  • objective='binary:logistic'

I use logistic regression for binary classification as objective function, because it is the most suitable objective function for the binary classification task of this data challenge

2/ Run model.fit(eval_set, eval_metric) and diagnose the first run, specifically the n_estimators parameter. After several iterations I noticed that beyond 103 estimators the performance drops. So I set the parameter to 103 before varying the other most important parameters of XGBoost and analyze their influence on the performance of the model

3/ Optimize max_depth parameter. It represents the depth of each tree, which is the maximum number of different features used in each tree. To find the best value of this parameter I firstly chose going from a low max_depth (3 for instance) and then increasing it incrementally by 1, and stopping when there’s no performance gain of increasing it. It is necessary to handle this parameter with care, because if this parameter is too high it can lead to overfitting the model.

4/ Try different value of learning rate and the features that avoids overfitting:

  • learning_rate: A lower learning rate can increase the prediction performance but increase the training time of the algorithm. I chose a learning rate at 0.3 because after several iterations of the algorithm it is the one that offers the best compromise performance/training time.
  • subsample, which is for each tree the % of rows taken to build the tree. I choose the default parameter which is 1 because not taking out too many rows, as performance will drop a lot.

  • colsample_bytree: It is the number of columns used by each tree. I set this parameter because at 1 I use a few features of my dataset.

  • gamma: It acts as a regularization parameter. I use the default parameter because in my case I found that changing the value of this parmeter did not influence the performance

1st XGBoost Approach

Build sample of train data

In [74]:
# # build sample of train data
temp_data = train_raw.sample(n=int(round(X_train.shape[0] * 1,0)), random_state=140)

X_train_reduce = temp_data.loc[:,best_selected_features]
y_train_reduce = temp_data.loc[:,["label_0"]]

Split data

In [75]:
# split into X_train, y_train, X_valid and y_valid
xgbc_X_train, xgbc_X_valid, xgbc_y_train, xgbc_y_valid = train_test_split(X_train_reduce, y_train_reduce, test_size=0.2, random_state=122, stratify=y_df_cleaned)

Define hyperparameters and fit the model

In [76]:
%%time

# fit model no training data
xgbc = XGBClassifier(booster='gbtree', learning_rate=0.3, 
                     max_depth=6, n_estimators=103, 
                     colsample_bynode=1, colsample_bytree=1,
                     subsample=1, gamma=0, objective='binary:logistic')
xgbc.fit(xgbc_X_train, xgbc_y_train)
Wall time: 44.6 s
Out[76]:
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
              colsample_bynode=1, colsample_bytree=1, enable_categorical=False,
              gamma=0, gpu_id=-1, importance_type=None,
              interaction_constraints='', learning_rate=0.3, max_delta_step=0,
              max_depth=6, min_child_weight=1, missing=nan,
              monotone_constraints='()', n_estimators=103, n_jobs=8,
              num_parallel_tree=1, predictor='auto', random_state=0,
              reg_alpha=0, reg_lambda=1, scale_pos_weight=1, subsample=1,
              tree_method='exact', validate_parameters=1, verbosity=None)

Compute FPR + FNR score

In [77]:
xgbc_y_pred = xgbc.predict(xgbc_X_valid)
valid_score = criterion(xgbc_y_pred, xgbc_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.4495025716541925

To further inspect the performance

In [79]:
# to further inspect the performance:
CM = confusion_matrix(xgbc_y_valid, xgbc_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
gb_y_prob = xgbc.predict_proba(xgbc_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(xgbc_y_valid, gb_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[23575  5955]
 [ 7412 22494]]
Accuracy: 0.775102631401844
False Positive Rate: 0.2016593294954284
False Negative Rate: 0.24784324215876413
FPR + FNR = 0.4495025716541925

Result discussion

The $FPR$ and $FNR$ rate is lower that I obtained with XGBoost for this data challenge: 0.45. Not surprisingly, the performance of XGBoost are the best.

2nd XGBoost Approach

It seems that the XGBoost model is the model that gives the best performance. The idea is to improve the performance of 1st XGBoost Approach model. To do this, I want to see if training the model on the whole training dataset gives better results and then fine-tune the hyperparameters of the model.

Let's try to run XGBoost on the entire dataset without taken into account the features selected.

Split whole dataset

In [65]:
# split into X_train, y_train, X_valid and y_valid
xgbc_X_train, xgbc_X_valid, xgbc_y_train, xgbc_y_valid = train_test_split(X_dataframe, y_dataframe, test_size=0.2, random_state=12)
In [66]:
%%time

# fit model
xgbc = XGBClassifier(booster='gbtree', learning_rate=0.3, 
                     max_depth=6, n_estimators=103, 
                     colsample_bynode=1, colsample_bytree=1,
                     subsample=1, gamma=0, objective='binary:logistic')

xgbc.fit(xgbc_X_train, xgbc_y_train)
Wall time: 2min 15s
Out[66]:
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
              colsample_bynode=1, colsample_bytree=1, enable_categorical=False,
              gamma=0, gpu_id=-1, importance_type=None,
              interaction_constraints='', learning_rate=0.3, max_delta_step=0,
              max_depth=6, min_child_weight=1, missing=nan,
              monotone_constraints='()', n_estimators=103, n_jobs=8,
              num_parallel_tree=1, predictor='auto', random_state=0,
              reg_alpha=0, reg_lambda=1, scale_pos_weight=1, subsample=1,
              tree_method='exact', validate_parameters=1, verbosity=None)

Define hyperparameters and fit the model

In [67]:
xgbc_y_pred = xgbc.predict(xgbc_X_valid)
valid_score = criterion(xgbc_y_pred, xgbc_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.400374619529163

To further inspect the performance

In [68]:
# to further inspect the performance:
CM = confusion_matrix(xgbc_y_valid, xgbc_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
gb_y_prob = xgbc.predict_proba(xgbc_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(xgbc_y_valid, gb_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[24238  5471]
 [ 6430 23308]]
Accuracy: 0.7998048682019278
False Positive Rate: 0.1841529502844256
False Negative Rate: 0.21622166924473737
FPR + FNR = 0.400374619529163

Result discussion

The $FPR$ and $FNR$ rate is the slower for this notebook: 0.40. Indeed, I managed to greatly improve the classification performance of my model by using all the features of the dataset. This is a very surprising thing. It means that training the model on the dataset with a limited number of features reduces the performances. Therefore, the variable selection for this data challenge does not seem to improve the performance of the models.

Note

As we have just seen with the implementation of the XGBoost model, training the model on the whole dataset at our disposal leads to better performance. Therefore, for the rest of this notebook, I will directly train the next models on the whole dataset.

Moreover, we could notice that it is the boosting algorithms that provide the best performances for the moment. There are other boosting algorithms which are an improvement of the XGBoost model like LightGBM and CatBoost. It is therefore necessary to test the performance of these models on our dataset

9/ LightGBM

Algorithm Explanation

Similar to XGBoost, LightGBM developped by Microsoft is a high-performance distributed framework that uses decision trees for ranking, classification and regression tasks. LightGBM is significantly faster than XGBoost but delivers almost equivalent performance. Faster training speed and accuracy resulting from LightGBM being a histogram-based algorithm that performs bucketing of values that also requires less memory. One of the strong points of is that LightGBM is also compatible with large and complex datasets but is much faster during training.

In contrast to the level-wise (horizontal) growth in XGBoost, LightGBM carries out leaf-wise (vertical) growth that results in more loss reduction and in turn higher accuracy while being faster. But this may also result in overfitting on the training data which could be handled using the max-depth parameter that specifies where the splitting would occur. Hence, XGBoost is capable of building more robust models than LightGBM.

Parameter choices strategy

As I said previously, Since LightGBM is similar to XGBoost, these two models share broadly the same parameters. Therefore, I used the same strategy for parameter refinement as the one used when implementing the XGBoost model

Hence, since LightGBM is similar to XGBoost, these two models share broadly the same parameters. Therefore, I used the same strategy for parameter refinement as the one used when implementing the XGBoost model. First, I initialize the model hyperparameters by filling in reasonable values for the key inputs :

  • learning_rate: 0.3
  • n_estimators: 100 because I train the model on the entire train dataset
  • max_depth: 3
  • subsample: 0.8
  • colsample_bytree: 1
  • objective='binary'

Then I refined the parameters by hand instead of using a grid search because I wanted to keep the total control on the optimization of the parameters (max_depth, learning_rate, n_estimators,...) by modifying very finely each parameter on a case by case basis to obtain the best model for our classification task.

As a result, for the optimal choice of hyperparameters for the LightGBM algorithm, I used a lower learning rate than I set for the XGBoost model and a much larger number of estimators. This can be explained in part by the difference in strategy between XGBoost and LighGBM regarding the level-wise growth strategy that I mentioned before.

But globally the best parameters I used are very similar to those I used for XGBoost (same max_depth value, same colsample_bynode value, colsample_bytree value, subsample value). This confirms the choices I made during the XGBoost algorithm since the models are globally based on the same operation

Split data

In [82]:
lgbc_X_train, lgbc_X_valid, lgbc_y_train, lgbc_y_valid = train_test_split(X_dataframe, y, test_size=0.2, random_state=34)

Define hyperparameters and fit the model

In [83]:
%%time

# fit model no training data
lgbc = LGBMClassifier(objective= 'binary', learning_rate=0.1, n_estimators = 2000, 
                      max_depth=6, colsample_bynode=1, colsample_bytree=1, 
                      subsample=1)

lgbc.fit(lgbc_X_train, lgbc_y_train, verbose=True)
Wall time: 1min 19s
Out[83]:
LGBMClassifier(colsample_bynode=1, colsample_bytree=1, max_depth=6,
               n_estimators=2000, objective='binary', subsample=1)

Compute FPR + FNR score

In [84]:
lgbc_y_pred = lgbc.predict(lgbc_X_valid)
valid_score = criterion(lgbc_y_pred, lgbc_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.39082887665052807

To further inspect the performance

In [85]:
# to further inspect the performance:
CM = confusion_matrix(lgbc_y_valid, lgbc_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
gb_y_prob = lgbc.predict_proba(lgbc_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(lgbc_y_valid, gb_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[24255  5388]
 [ 6231 23573]]
Accuracy: 0.8045485894998906
False Positive Rate: 0.1817629794555207
False Negative Rate: 0.20906589719500737
FPR + FNR = 0.39082887665052807

Result discussion

The $FPR$ and $FNR$ rate is the lower for this notebook: 0.39. LightGBM outperforms XGBoost. Maybe it's just because I'm taking advantage of LightGBM's improvements or simply because of a better choice of hyperparameters or both at the same time. In any case, LightGBM is the best model to perform the classification task for this data challenge

10/ CatBoost

Algorithm Explanation

CatBoost is an open-source machine learning (gradient boosting) algorithm, whose name comes from "Category" and "Boosting".

CatBoost builds symmetric (balanced) trees, unlike XGBoost and LightGBM. At each step, the leaves of the previous tree are split using the same condition. The feature-split pair that represents the lowest loss is selected and used for all nodes in the tier. This balanced tree architecture facilitates efficient processor implementation, reduces prediction time, makes model applicators fast, and controls overfitting as the structure serves as a regularization.

Classic boosting algorithms are prone to overfitting on small/noisy data sets due to a problem known as prediction lag. When computing the gradient estimate of a data instance, these algorithms use the same data instances with which the model was built, thus having no chance of encountering unseen data. CatBoost, on the other hand, uses the concept of ordered boosting, a permutation-based approach to train the model on one subset of the data while computing residuals on another subset, thus preventing target leakage and overfitting.

Globally, CatBoost is based on the same principle, namely the boosting technique, but integrates new approaches that allow it to be in some cases more efficient than XGBoost and LightGBM, both in terms of prediction time and accuracy of the generated prediction.

Parameter choices strategy

As I said previously, Since CatBoost is similar to XGBoost and LightGBM, these two models share broadly the same parameters. Therefore, I used the same strategy for parameter refinement as the one used when implementing the XGBoost model

Hence, since LightGBM is similar to XGBoost, these two models share broadly the same parameters. Therefore, I used the same strategy for parameter refinement as the one used when implementing the XGBoost model. First, I initialize the model hyperparameters by filling in reasonable values for the key inputs :

  • learning_rate: 0.3
  • iteration: 1000 equivalent to n_estimators for XGBoost and LightGBM
  • subsample: 1
  • eval_metric='LogLoss' similar to objective:binary:logistic for XGBoost

Then I refined the parameters by hand instead of using a grid search because I wanted to keep the total control on the optimization of the parameters (learning_rate, iteration, subsample,...) by modifying very finely each parameter on a case by case basis to obtain the best model for our classification task.

As a result, for the optimal choice of hyperparameters for the LightGBM algorithm, I used the same learning rate than I set for the LightGBM model and a much larger number of estimators than XGBoost but less than LightGBM. Moreover I set a lower value for the subsample hyperparameter than I set for LightGBM and XGBoost.

Split whole dataset

In [86]:
catgbc_X_train, catgbc_X_valid, catgbc_y_train, catgbc_y_valid = train_test_split(X_dataframe, y, test_size=0.2, random_state=12)

Define hyperparameters and fit the mode

In [87]:
%%time

# fit model no training data
catgbc = CatBoostClassifier(eval_metric= 'Logloss', iterations= 1500, 
                            learning_rate= 0.1, subsample= 0.8)

catgbc.fit(catgbc_X_train, catgbc_y_train, verbose=True)
0:	learn: 0.6650869	total: 329ms	remaining: 8m 13s
1:	learn: 0.6411800	total: 486ms	remaining: 6m 4s
2:	learn: 0.6217118	total: 633ms	remaining: 5m 15s
3:	learn: 0.6048473	total: 858ms	remaining: 5m 21s
4:	learn: 0.5923338	total: 1.04s	remaining: 5m 9s
5:	learn: 0.5807152	total: 1.21s	remaining: 5m
6:	learn: 0.5709959	total: 1.36s	remaining: 4m 50s
7:	learn: 0.5621674	total: 1.6s	remaining: 4m 58s
8:	learn: 0.5539617	total: 1.79s	remaining: 4m 56s
9:	learn: 0.5481931	total: 1.94s	remaining: 4m 49s
10:	learn: 0.5421322	total: 2.15s	remaining: 4m 50s
11:	learn: 0.5363346	total: 2.4s	remaining: 4m 57s
12:	learn: 0.5319680	total: 2.65s	remaining: 5m 3s
13:	learn: 0.5275236	total: 2.92s	remaining: 5m 10s
14:	learn: 0.5238307	total: 3.15s	remaining: 5m 11s
15:	learn: 0.5207628	total: 3.39s	remaining: 5m 14s
16:	learn: 0.5173554	total: 3.62s	remaining: 5m 15s
17:	learn: 0.5139151	total: 3.86s	remaining: 5m 18s
18:	learn: 0.5111380	total: 4.1s	remaining: 5m 19s
19:	learn: 0.5086079	total: 4.29s	remaining: 5m 17s
20:	learn: 0.5065998	total: 4.43s	remaining: 5m 11s
21:	learn: 0.5038679	total: 4.59s	remaining: 5m 8s
22:	learn: 0.5017601	total: 4.76s	remaining: 5m 5s
23:	learn: 0.4998001	total: 4.94s	remaining: 5m 3s
24:	learn: 0.4977284	total: 5.12s	remaining: 5m 2s
25:	learn: 0.4944583	total: 5.32s	remaining: 5m 1s
26:	learn: 0.4929256	total: 5.52s	remaining: 5m 1s
27:	learn: 0.4915505	total: 5.7s	remaining: 4m 59s
28:	learn: 0.4900334	total: 5.88s	remaining: 4m 58s
29:	learn: 0.4886685	total: 6.04s	remaining: 4m 56s
30:	learn: 0.4871681	total: 6.22s	remaining: 4m 54s
31:	learn: 0.4860732	total: 6.38s	remaining: 4m 52s
32:	learn: 0.4847305	total: 6.56s	remaining: 4m 51s
33:	learn: 0.4838905	total: 6.7s	remaining: 4m 48s
34:	learn: 0.4829251	total: 6.84s	remaining: 4m 46s
35:	learn: 0.4818192	total: 7.01s	remaining: 4m 45s
36:	learn: 0.4803672	total: 7.2s	remaining: 4m 44s
37:	learn: 0.4795256	total: 7.39s	remaining: 4m 44s
38:	learn: 0.4787581	total: 7.58s	remaining: 4m 43s
39:	learn: 0.4769125	total: 7.76s	remaining: 4m 43s
40:	learn: 0.4759171	total: 7.93s	remaining: 4m 42s
41:	learn: 0.4752232	total: 8.11s	remaining: 4m 41s
42:	learn: 0.4744690	total: 8.28s	remaining: 4m 40s
43:	learn: 0.4736836	total: 8.43s	remaining: 4m 39s
44:	learn: 0.4729862	total: 8.57s	remaining: 4m 37s
45:	learn: 0.4722573	total: 8.72s	remaining: 4m 35s
46:	learn: 0.4713801	total: 8.91s	remaining: 4m 35s
47:	learn: 0.4710223	total: 9.06s	remaining: 4m 34s
48:	learn: 0.4694440	total: 9.21s	remaining: 4m 32s
49:	learn: 0.4683829	total: 9.39s	remaining: 4m 32s
50:	learn: 0.4678928	total: 9.56s	remaining: 4m 31s
51:	learn: 0.4669539	total: 9.7s	remaining: 4m 29s
52:	learn: 0.4658610	total: 9.85s	remaining: 4m 28s
53:	learn: 0.4651373	total: 9.99s	remaining: 4m 27s
54:	learn: 0.4640610	total: 10.1s	remaining: 4m 26s
55:	learn: 0.4637025	total: 10.3s	remaining: 4m 25s
56:	learn: 0.4631287	total: 10.5s	remaining: 4m 24s
57:	learn: 0.4621569	total: 10.6s	remaining: 4m 23s
58:	learn: 0.4618297	total: 10.8s	remaining: 4m 22s
59:	learn: 0.4609509	total: 10.9s	remaining: 4m 21s
60:	learn: 0.4604425	total: 11s	remaining: 4m 20s
61:	learn: 0.4596328	total: 11.2s	remaining: 4m 19s
62:	learn: 0.4591461	total: 11.4s	remaining: 4m 19s
63:	learn: 0.4583282	total: 11.5s	remaining: 4m 18s
64:	learn: 0.4578251	total: 11.7s	remaining: 4m 18s
65:	learn: 0.4573729	total: 11.8s	remaining: 4m 17s
66:	learn: 0.4569968	total: 12s	remaining: 4m 16s
67:	learn: 0.4564474	total: 12.1s	remaining: 4m 15s
68:	learn: 0.4560180	total: 12.3s	remaining: 4m 15s
69:	learn: 0.4555947	total: 12.5s	remaining: 4m 14s
70:	learn: 0.4551849	total: 12.6s	remaining: 4m 13s
71:	learn: 0.4540379	total: 12.8s	remaining: 4m 14s
72:	learn: 0.4537562	total: 12.9s	remaining: 4m 13s
73:	learn: 0.4534386	total: 13.1s	remaining: 4m 12s
74:	learn: 0.4528139	total: 13.3s	remaining: 4m 12s
75:	learn: 0.4520733	total: 13.5s	remaining: 4m 13s
76:	learn: 0.4517739	total: 13.7s	remaining: 4m 13s
77:	learn: 0.4514314	total: 13.8s	remaining: 4m 12s
78:	learn: 0.4510512	total: 14s	remaining: 4m 11s
79:	learn: 0.4504289	total: 14.2s	remaining: 4m 11s
80:	learn: 0.4500258	total: 14.3s	remaining: 4m 10s
81:	learn: 0.4495999	total: 14.5s	remaining: 4m 10s
82:	learn: 0.4492354	total: 14.7s	remaining: 4m 10s
83:	learn: 0.4488382	total: 14.9s	remaining: 4m 10s
84:	learn: 0.4485437	total: 15s	remaining: 4m 10s
85:	learn: 0.4483555	total: 15.2s	remaining: 4m 9s
86:	learn: 0.4480841	total: 15.3s	remaining: 4m 8s
87:	learn: 0.4476697	total: 15.5s	remaining: 4m 8s
88:	learn: 0.4474014	total: 15.7s	remaining: 4m 8s
89:	learn: 0.4471210	total: 15.8s	remaining: 4m 7s
90:	learn: 0.4468090	total: 15.9s	remaining: 4m 6s
91:	learn: 0.4465511	total: 16.1s	remaining: 4m 5s
92:	learn: 0.4461830	total: 16.2s	remaining: 4m 4s
93:	learn: 0.4459114	total: 16.3s	remaining: 4m 3s
94:	learn: 0.4453665	total: 16.5s	remaining: 4m 3s
95:	learn: 0.4451112	total: 16.6s	remaining: 4m 2s
96:	learn: 0.4446841	total: 16.7s	remaining: 4m 2s
97:	learn: 0.4440914	total: 16.9s	remaining: 4m 1s
98:	learn: 0.4438212	total: 17s	remaining: 4m
99:	learn: 0.4435295	total: 17.1s	remaining: 3m 59s
100:	learn: 0.4432962	total: 17.2s	remaining: 3m 58s
101:	learn: 0.4426627	total: 17.3s	remaining: 3m 57s
102:	learn: 0.4424432	total: 17.5s	remaining: 3m 57s
103:	learn: 0.4422708	total: 17.6s	remaining: 3m 56s
104:	learn: 0.4420682	total: 17.7s	remaining: 3m 55s
105:	learn: 0.4414525	total: 17.9s	remaining: 3m 55s
106:	learn: 0.4409231	total: 18.1s	remaining: 3m 55s
107:	learn: 0.4403554	total: 18.3s	remaining: 3m 55s
108:	learn: 0.4397406	total: 18.4s	remaining: 3m 55s
109:	learn: 0.4392713	total: 18.6s	remaining: 3m 54s
110:	learn: 0.4390503	total: 18.8s	remaining: 3m 54s
111:	learn: 0.4387495	total: 19s	remaining: 3m 54s
112:	learn: 0.4384526	total: 19.1s	remaining: 3m 54s
113:	learn: 0.4382712	total: 19.3s	remaining: 3m 54s
114:	learn: 0.4381076	total: 19.4s	remaining: 3m 53s
115:	learn: 0.4378713	total: 19.6s	remaining: 3m 53s
116:	learn: 0.4376512	total: 19.7s	remaining: 3m 53s
117:	learn: 0.4374948	total: 19.8s	remaining: 3m 52s
118:	learn: 0.4372604	total: 20s	remaining: 3m 51s
119:	learn: 0.4371041	total: 20.1s	remaining: 3m 51s
120:	learn: 0.4368620	total: 20.3s	remaining: 3m 50s
121:	learn: 0.4365358	total: 20.4s	remaining: 3m 50s
122:	learn: 0.4363281	total: 20.5s	remaining: 3m 50s
123:	learn: 0.4360847	total: 20.7s	remaining: 3m 49s
124:	learn: 0.4359130	total: 20.8s	remaining: 3m 49s
125:	learn: 0.4354796	total: 21s	remaining: 3m 48s
126:	learn: 0.4352692	total: 21.1s	remaining: 3m 48s
127:	learn: 0.4350797	total: 21.3s	remaining: 3m 47s
128:	learn: 0.4349256	total: 21.4s	remaining: 3m 47s
129:	learn: 0.4347981	total: 21.6s	remaining: 3m 47s
130:	learn: 0.4345545	total: 21.7s	remaining: 3m 46s
131:	learn: 0.4343529	total: 21.8s	remaining: 3m 46s
132:	learn: 0.4338919	total: 21.9s	remaining: 3m 45s
133:	learn: 0.4336986	total: 22s	remaining: 3m 44s
134:	learn: 0.4335158	total: 22.2s	remaining: 3m 44s
135:	learn: 0.4332763	total: 22.3s	remaining: 3m 44s
136:	learn: 0.4330899	total: 22.5s	remaining: 3m 43s
137:	learn: 0.4329010	total: 22.6s	remaining: 3m 42s
138:	learn: 0.4327303	total: 22.7s	remaining: 3m 42s
139:	learn: 0.4324747	total: 22.8s	remaining: 3m 41s
140:	learn: 0.4321264	total: 22.9s	remaining: 3m 40s
141:	learn: 0.4317630	total: 23.1s	remaining: 3m 40s
142:	learn: 0.4315454	total: 23.2s	remaining: 3m 40s
143:	learn: 0.4313632	total: 23.3s	remaining: 3m 39s
144:	learn: 0.4311079	total: 23.5s	remaining: 3m 39s
145:	learn: 0.4306293	total: 23.6s	remaining: 3m 38s
146:	learn: 0.4303977	total: 23.7s	remaining: 3m 38s
147:	learn: 0.4302217	total: 23.8s	remaining: 3m 37s
148:	learn: 0.4298705	total: 23.9s	remaining: 3m 36s
149:	learn: 0.4297155	total: 24s	remaining: 3m 36s
150:	learn: 0.4295453	total: 24.1s	remaining: 3m 35s
151:	learn: 0.4293379	total: 24.3s	remaining: 3m 35s
152:	learn: 0.4291598	total: 24.4s	remaining: 3m 34s
153:	learn: 0.4290174	total: 24.5s	remaining: 3m 34s
154:	learn: 0.4288495	total: 24.7s	remaining: 3m 33s
155:	learn: 0.4286854	total: 24.8s	remaining: 3m 33s
156:	learn: 0.4285175	total: 24.9s	remaining: 3m 32s
157:	learn: 0.4283705	total: 25s	remaining: 3m 32s
158:	learn: 0.4280544	total: 25.1s	remaining: 3m 31s
159:	learn: 0.4276702	total: 25.3s	remaining: 3m 31s
160:	learn: 0.4272829	total: 25.4s	remaining: 3m 31s
161:	learn: 0.4271417	total: 25.5s	remaining: 3m 30s
162:	learn: 0.4268727	total: 25.7s	remaining: 3m 30s
163:	learn: 0.4267120	total: 25.8s	remaining: 3m 29s
164:	learn: 0.4265385	total: 25.9s	remaining: 3m 29s
165:	learn: 0.4262944	total: 26s	remaining: 3m 29s
166:	learn: 0.4261242	total: 26.2s	remaining: 3m 28s
167:	learn: 0.4259579	total: 26.3s	remaining: 3m 28s
168:	learn: 0.4257445	total: 26.5s	remaining: 3m 28s
169:	learn: 0.4255745	total: 26.6s	remaining: 3m 28s
170:	learn: 0.4254542	total: 26.7s	remaining: 3m 27s
171:	learn: 0.4252638	total: 26.8s	remaining: 3m 27s
172:	learn: 0.4250217	total: 26.9s	remaining: 3m 26s
173:	learn: 0.4248191	total: 27.1s	remaining: 3m 26s
174:	learn: 0.4245686	total: 27.2s	remaining: 3m 26s
175:	learn: 0.4242933	total: 27.3s	remaining: 3m 25s
176:	learn: 0.4241383	total: 27.5s	remaining: 3m 25s
177:	learn: 0.4239878	total: 27.6s	remaining: 3m 25s
178:	learn: 0.4238389	total: 27.7s	remaining: 3m 24s
179:	learn: 0.4237020	total: 27.8s	remaining: 3m 24s
180:	learn: 0.4235660	total: 28s	remaining: 3m 23s
181:	learn: 0.4234399	total: 28.1s	remaining: 3m 23s
182:	learn: 0.4233301	total: 28.3s	remaining: 3m 23s
183:	learn: 0.4231566	total: 28.4s	remaining: 3m 23s
184:	learn: 0.4230037	total: 28.6s	remaining: 3m 23s
185:	learn: 0.4228927	total: 28.7s	remaining: 3m 22s
186:	learn: 0.4227508	total: 28.8s	remaining: 3m 22s
187:	learn: 0.4226021	total: 28.9s	remaining: 3m 21s
188:	learn: 0.4224608	total: 29.1s	remaining: 3m 21s
189:	learn: 0.4223507	total: 29.2s	remaining: 3m 21s
190:	learn: 0.4221859	total: 29.3s	remaining: 3m 20s
191:	learn: 0.4219437	total: 29.4s	remaining: 3m 20s
192:	learn: 0.4218010	total: 29.5s	remaining: 3m 20s
193:	learn: 0.4216345	total: 29.7s	remaining: 3m 19s
194:	learn: 0.4215043	total: 29.8s	remaining: 3m 19s
195:	learn: 0.4214055	total: 29.9s	remaining: 3m 18s
196:	learn: 0.4212346	total: 30s	remaining: 3m 18s
197:	learn: 0.4210324	total: 30.1s	remaining: 3m 18s
198:	learn: 0.4209072	total: 30.2s	remaining: 3m 17s
199:	learn: 0.4206930	total: 30.4s	remaining: 3m 17s
200:	learn: 0.4205368	total: 30.5s	remaining: 3m 16s
201:	learn: 0.4203194	total: 30.6s	remaining: 3m 16s
202:	learn: 0.4201518	total: 30.7s	remaining: 3m 16s
203:	learn: 0.4199790	total: 30.9s	remaining: 3m 16s
204:	learn: 0.4198538	total: 31s	remaining: 3m 15s
205:	learn: 0.4197381	total: 31.1s	remaining: 3m 15s
206:	learn: 0.4195910	total: 31.3s	remaining: 3m 15s
207:	learn: 0.4194753	total: 31.4s	remaining: 3m 15s
208:	learn: 0.4193641	total: 31.5s	remaining: 3m 14s
209:	learn: 0.4192307	total: 31.6s	remaining: 3m 14s
210:	learn: 0.4191265	total: 31.8s	remaining: 3m 14s
211:	learn: 0.4189671	total: 31.9s	remaining: 3m 13s
212:	learn: 0.4188487	total: 32s	remaining: 3m 13s
213:	learn: 0.4187335	total: 32.1s	remaining: 3m 13s
214:	learn: 0.4185614	total: 32.2s	remaining: 3m 12s
215:	learn: 0.4183632	total: 32.4s	remaining: 3m 12s
216:	learn: 0.4181933	total: 32.5s	remaining: 3m 12s
217:	learn: 0.4180930	total: 32.7s	remaining: 3m 12s
218:	learn: 0.4180029	total: 32.8s	remaining: 3m 11s
219:	learn: 0.4179095	total: 32.9s	remaining: 3m 11s
220:	learn: 0.4177894	total: 33s	remaining: 3m 10s
221:	learn: 0.4176699	total: 33.1s	remaining: 3m 10s
222:	learn: 0.4175295	total: 33.2s	remaining: 3m 10s
223:	learn: 0.4174234	total: 33.4s	remaining: 3m 9s
224:	learn: 0.4172908	total: 33.5s	remaining: 3m 9s
225:	learn: 0.4171686	total: 33.6s	remaining: 3m 9s
226:	learn: 0.4170633	total: 33.8s	remaining: 3m 9s
227:	learn: 0.4169650	total: 33.9s	remaining: 3m 9s
228:	learn: 0.4168423	total: 34.1s	remaining: 3m 9s
229:	learn: 0.4167147	total: 34.3s	remaining: 3m 9s
230:	learn: 0.4165802	total: 34.4s	remaining: 3m 9s
231:	learn: 0.4164305	total: 34.6s	remaining: 3m 8s
232:	learn: 0.4163451	total: 34.7s	remaining: 3m 8s
233:	learn: 0.4162467	total: 34.9s	remaining: 3m 8s
234:	learn: 0.4160811	total: 35s	remaining: 3m 8s
235:	learn: 0.4159739	total: 35.2s	remaining: 3m 8s
236:	learn: 0.4158911	total: 35.3s	remaining: 3m 8s
237:	learn: 0.4157820	total: 35.5s	remaining: 3m 8s
238:	learn: 0.4156744	total: 35.6s	remaining: 3m 8s
239:	learn: 0.4155717	total: 35.8s	remaining: 3m 7s
240:	learn: 0.4154835	total: 35.9s	remaining: 3m 7s
241:	learn: 0.4153700	total: 36s	remaining: 3m 7s
242:	learn: 0.4152573	total: 36.1s	remaining: 3m 6s
243:	learn: 0.4151416	total: 36.2s	remaining: 3m 6s
244:	learn: 0.4150580	total: 36.3s	remaining: 3m 6s
245:	learn: 0.4149630	total: 36.5s	remaining: 3m 5s
246:	learn: 0.4147419	total: 36.6s	remaining: 3m 5s
247:	learn: 0.4146342	total: 36.8s	remaining: 3m 5s
248:	learn: 0.4145247	total: 36.9s	remaining: 3m 5s
249:	learn: 0.4144287	total: 37s	remaining: 3m 5s
250:	learn: 0.4143367	total: 37.1s	remaining: 3m 4s
251:	learn: 0.4142780	total: 37.2s	remaining: 3m 4s
252:	learn: 0.4141778	total: 37.4s	remaining: 3m 4s
253:	learn: 0.4140878	total: 37.5s	remaining: 3m 4s
254:	learn: 0.4139783	total: 37.6s	remaining: 3m 3s
255:	learn: 0.4139071	total: 37.8s	remaining: 3m 3s
256:	learn: 0.4137988	total: 37.9s	remaining: 3m 3s
257:	learn: 0.4136708	total: 38.1s	remaining: 3m 3s
258:	learn: 0.4135791	total: 38.2s	remaining: 3m 2s
259:	learn: 0.4134508	total: 38.3s	remaining: 3m 2s
260:	learn: 0.4133515	total: 38.4s	remaining: 3m 2s
261:	learn: 0.4132693	total: 38.5s	remaining: 3m 2s
262:	learn: 0.4131324	total: 38.7s	remaining: 3m 1s
263:	learn: 0.4130433	total: 38.8s	remaining: 3m 1s
264:	learn: 0.4129616	total: 38.9s	remaining: 3m 1s
265:	learn: 0.4128696	total: 39.1s	remaining: 3m 1s
266:	learn: 0.4127906	total: 39.1s	remaining: 3m
267:	learn: 0.4126788	total: 39.3s	remaining: 3m
268:	learn: 0.4125614	total: 39.4s	remaining: 3m
269:	learn: 0.4124559	total: 39.5s	remaining: 2m 59s
270:	learn: 0.4123647	total: 39.6s	remaining: 2m 59s
271:	learn: 0.4122665	total: 39.7s	remaining: 2m 59s
272:	learn: 0.4121809	total: 39.8s	remaining: 2m 58s
273:	learn: 0.4120562	total: 39.9s	remaining: 2m 58s
274:	learn: 0.4119468	total: 40s	remaining: 2m 58s
275:	learn: 0.4118434	total: 40.2s	remaining: 2m 58s
276:	learn: 0.4117427	total: 40.3s	remaining: 2m 57s
277:	learn: 0.4116233	total: 40.4s	remaining: 2m 57s
278:	learn: 0.4115538	total: 40.5s	remaining: 2m 57s
279:	learn: 0.4114598	total: 40.6s	remaining: 2m 57s
280:	learn: 0.4113658	total: 40.8s	remaining: 2m 56s
281:	learn: 0.4112827	total: 40.9s	remaining: 2m 56s
282:	learn: 0.4111415	total: 41s	remaining: 2m 56s
283:	learn: 0.4110267	total: 41.1s	remaining: 2m 56s
284:	learn: 0.4109115	total: 41.3s	remaining: 2m 55s
285:	learn: 0.4108055	total: 41.4s	remaining: 2m 55s
286:	learn: 0.4107300	total: 41.5s	remaining: 2m 55s
287:	learn: 0.4106212	total: 41.6s	remaining: 2m 55s
288:	learn: 0.4105268	total: 41.7s	remaining: 2m 54s
289:	learn: 0.4104587	total: 41.9s	remaining: 2m 54s
290:	learn: 0.4103714	total: 42s	remaining: 2m 54s
291:	learn: 0.4102808	total: 42.1s	remaining: 2m 54s
292:	learn: 0.4101882	total: 42.2s	remaining: 2m 53s
293:	learn: 0.4101188	total: 42.3s	remaining: 2m 53s
294:	learn: 0.4100504	total: 42.4s	remaining: 2m 53s
295:	learn: 0.4099516	total: 42.6s	remaining: 2m 53s
296:	learn: 0.4098897	total: 42.7s	remaining: 2m 52s
297:	learn: 0.4098115	total: 42.8s	remaining: 2m 52s
298:	learn: 0.4097169	total: 43s	remaining: 2m 52s
299:	learn: 0.4096384	total: 43.1s	remaining: 2m 52s
300:	learn: 0.4095703	total: 43.2s	remaining: 2m 52s
301:	learn: 0.4094938	total: 43.3s	remaining: 2m 51s
302:	learn: 0.4094051	total: 43.4s	remaining: 2m 51s
303:	learn: 0.4093298	total: 43.6s	remaining: 2m 51s
304:	learn: 0.4092585	total: 43.7s	remaining: 2m 51s
305:	learn: 0.4091893	total: 43.8s	remaining: 2m 50s
306:	learn: 0.4091027	total: 43.9s	remaining: 2m 50s
307:	learn: 0.4090120	total: 44.1s	remaining: 2m 50s
308:	learn: 0.4089163	total: 44.2s	remaining: 2m 50s
309:	learn: 0.4087967	total: 44.3s	remaining: 2m 50s
310:	learn: 0.4087240	total: 44.4s	remaining: 2m 49s
311:	learn: 0.4086290	total: 44.5s	remaining: 2m 49s
312:	learn: 0.4085105	total: 44.6s	remaining: 2m 49s
313:	learn: 0.4084084	total: 44.8s	remaining: 2m 49s
314:	learn: 0.4083281	total: 44.9s	remaining: 2m 48s
315:	learn: 0.4082548	total: 45s	remaining: 2m 48s
316:	learn: 0.4081701	total: 45.1s	remaining: 2m 48s
317:	learn: 0.4080834	total: 45.3s	remaining: 2m 48s
318:	learn: 0.4080161	total: 45.4s	remaining: 2m 48s
319:	learn: 0.4079224	total: 45.5s	remaining: 2m 47s
320:	learn: 0.4078398	total: 45.6s	remaining: 2m 47s
321:	learn: 0.4077585	total: 45.7s	remaining: 2m 47s
322:	learn: 0.4076920	total: 45.9s	remaining: 2m 47s
323:	learn: 0.4076169	total: 46s	remaining: 2m 47s
324:	learn: 0.4075533	total: 46.2s	remaining: 2m 46s
325:	learn: 0.4074743	total: 46.3s	remaining: 2m 46s
326:	learn: 0.4073945	total: 46.4s	remaining: 2m 46s
327:	learn: 0.4072975	total: 46.5s	remaining: 2m 46s
328:	learn: 0.4072449	total: 46.6s	remaining: 2m 45s
329:	learn: 0.4071576	total: 46.8s	remaining: 2m 45s
330:	learn: 0.4070725	total: 46.9s	remaining: 2m 45s
331:	learn: 0.4069988	total: 47s	remaining: 2m 45s
332:	learn: 0.4069070	total: 47.1s	remaining: 2m 45s
333:	learn: 0.4068473	total: 47.2s	remaining: 2m 44s
334:	learn: 0.4067860	total: 47.4s	remaining: 2m 44s
335:	learn: 0.4067039	total: 47.5s	remaining: 2m 44s
336:	learn: 0.4066202	total: 47.6s	remaining: 2m 44s
337:	learn: 0.4065462	total: 47.7s	remaining: 2m 44s
338:	learn: 0.4064609	total: 47.8s	remaining: 2m 43s
339:	learn: 0.4063894	total: 47.9s	remaining: 2m 43s
340:	learn: 0.4063353	total: 48s	remaining: 2m 43s
341:	learn: 0.4062506	total: 48.2s	remaining: 2m 43s
342:	learn: 0.4061396	total: 48.3s	remaining: 2m 42s
343:	learn: 0.4060467	total: 48.4s	remaining: 2m 42s
344:	learn: 0.4059969	total: 48.5s	remaining: 2m 42s
345:	learn: 0.4059116	total: 48.6s	remaining: 2m 42s
346:	learn: 0.4058355	total: 48.7s	remaining: 2m 41s
347:	learn: 0.4057598	total: 48.8s	remaining: 2m 41s
348:	learn: 0.4056788	total: 49s	remaining: 2m 41s
349:	learn: 0.4056070	total: 49.1s	remaining: 2m 41s
350:	learn: 0.4055134	total: 49.2s	remaining: 2m 41s
351:	learn: 0.4054214	total: 49.4s	remaining: 2m 41s
352:	learn: 0.4053231	total: 49.5s	remaining: 2m 40s
353:	learn: 0.4052485	total: 49.7s	remaining: 2m 40s
354:	learn: 0.4051725	total: 49.8s	remaining: 2m 40s
355:	learn: 0.4051173	total: 49.9s	remaining: 2m 40s
356:	learn: 0.4050181	total: 50.1s	remaining: 2m 40s
357:	learn: 0.4049576	total: 50.2s	remaining: 2m 40s
358:	learn: 0.4048947	total: 50.4s	remaining: 2m 40s
359:	learn: 0.4048213	total: 50.5s	remaining: 2m 39s
360:	learn: 0.4047467	total: 50.6s	remaining: 2m 39s
361:	learn: 0.4046607	total: 50.8s	remaining: 2m 39s
362:	learn: 0.4045940	total: 50.9s	remaining: 2m 39s
363:	learn: 0.4044992	total: 51.1s	remaining: 2m 39s
364:	learn: 0.4044115	total: 51.2s	remaining: 2m 39s
365:	learn: 0.4043328	total: 51.3s	remaining: 2m 38s
366:	learn: 0.4042476	total: 51.4s	remaining: 2m 38s
367:	learn: 0.4041745	total: 51.5s	remaining: 2m 38s
368:	learn: 0.4040970	total: 51.6s	remaining: 2m 38s
369:	learn: 0.4040293	total: 51.7s	remaining: 2m 37s
370:	learn: 0.4039593	total: 51.8s	remaining: 2m 37s
371:	learn: 0.4038816	total: 52s	remaining: 2m 37s
372:	learn: 0.4038278	total: 52.1s	remaining: 2m 37s
373:	learn: 0.4037759	total: 52.2s	remaining: 2m 37s
374:	learn: 0.4037034	total: 52.3s	remaining: 2m 36s
375:	learn: 0.4036021	total: 52.4s	remaining: 2m 36s
376:	learn: 0.4035164	total: 52.6s	remaining: 2m 36s
377:	learn: 0.4034367	total: 52.7s	remaining: 2m 36s
378:	learn: 0.4033626	total: 52.8s	remaining: 2m 36s
379:	learn: 0.4032920	total: 52.9s	remaining: 2m 35s
380:	learn: 0.4032345	total: 53s	remaining: 2m 35s
381:	learn: 0.4031842	total: 53.1s	remaining: 2m 35s
382:	learn: 0.4031103	total: 53.3s	remaining: 2m 35s
383:	learn: 0.4030435	total: 53.4s	remaining: 2m 35s
384:	learn: 0.4029774	total: 53.5s	remaining: 2m 35s
385:	learn: 0.4029199	total: 53.6s	remaining: 2m 34s
386:	learn: 0.4028477	total: 53.8s	remaining: 2m 34s
387:	learn: 0.4027816	total: 53.9s	remaining: 2m 34s
388:	learn: 0.4027216	total: 54s	remaining: 2m 34s
389:	learn: 0.4026636	total: 54.1s	remaining: 2m 34s
390:	learn: 0.4025814	total: 54.3s	remaining: 2m 34s
391:	learn: 0.4024954	total: 54.5s	remaining: 2m 34s
392:	learn: 0.4024155	total: 54.6s	remaining: 2m 33s
393:	learn: 0.4023596	total: 54.7s	remaining: 2m 33s
394:	learn: 0.4023046	total: 54.8s	remaining: 2m 33s
395:	learn: 0.4022664	total: 54.9s	remaining: 2m 33s
396:	learn: 0.4022069	total: 55s	remaining: 2m 32s
397:	learn: 0.4021497	total: 55.1s	remaining: 2m 32s
398:	learn: 0.4020791	total: 55.3s	remaining: 2m 32s
399:	learn: 0.4020160	total: 55.4s	remaining: 2m 32s
400:	learn: 0.4019454	total: 55.5s	remaining: 2m 32s
401:	learn: 0.4018606	total: 55.6s	remaining: 2m 31s
402:	learn: 0.4017978	total: 55.8s	remaining: 2m 31s
403:	learn: 0.4017438	total: 55.9s	remaining: 2m 31s
404:	learn: 0.4016781	total: 56.1s	remaining: 2m 31s
405:	learn: 0.4016060	total: 56.2s	remaining: 2m 31s
406:	learn: 0.4015233	total: 56.4s	remaining: 2m 31s
407:	learn: 0.4014577	total: 56.5s	remaining: 2m 31s
408:	learn: 0.4013667	total: 56.7s	remaining: 2m 31s
409:	learn: 0.4013022	total: 56.8s	remaining: 2m 31s
410:	learn: 0.4012400	total: 57s	remaining: 2m 30s
411:	learn: 0.4011681	total: 57.1s	remaining: 2m 30s
412:	learn: 0.4011145	total: 57.3s	remaining: 2m 30s
413:	learn: 0.4010444	total: 57.4s	remaining: 2m 30s
414:	learn: 0.4009774	total: 57.5s	remaining: 2m 30s
415:	learn: 0.4009316	total: 57.7s	remaining: 2m 30s
416:	learn: 0.4008546	total: 57.8s	remaining: 2m 30s
417:	learn: 0.4007915	total: 57.9s	remaining: 2m 29s
418:	learn: 0.4007424	total: 58.1s	remaining: 2m 29s
419:	learn: 0.4006909	total: 58.2s	remaining: 2m 29s
420:	learn: 0.4006317	total: 58.3s	remaining: 2m 29s
421:	learn: 0.4005677	total: 58.5s	remaining: 2m 29s
422:	learn: 0.4005169	total: 58.6s	remaining: 2m 29s
423:	learn: 0.4004522	total: 58.7s	remaining: 2m 28s
424:	learn: 0.4003759	total: 58.8s	remaining: 2m 28s
425:	learn: 0.4003025	total: 59s	remaining: 2m 28s
426:	learn: 0.4002242	total: 59.1s	remaining: 2m 28s
427:	learn: 0.4001756	total: 59.2s	remaining: 2m 28s
428:	learn: 0.4001144	total: 59.3s	remaining: 2m 28s
429:	learn: 0.4000623	total: 59.6s	remaining: 2m 28s
430:	learn: 0.4000231	total: 59.9s	remaining: 2m 28s
431:	learn: 0.3999515	total: 1m	remaining: 2m 28s
432:	learn: 0.3998965	total: 1m	remaining: 2m 28s
433:	learn: 0.3998465	total: 1m	remaining: 2m 28s
434:	learn: 0.3997707	total: 1m	remaining: 2m 28s
435:	learn: 0.3996908	total: 1m	remaining: 2m 28s
436:	learn: 0.3996154	total: 1m	remaining: 2m 28s
437:	learn: 0.3995556	total: 1m 1s	remaining: 2m 27s
438:	learn: 0.3995088	total: 1m 1s	remaining: 2m 27s
439:	learn: 0.3994497	total: 1m 1s	remaining: 2m 27s
440:	learn: 0.3993561	total: 1m 1s	remaining: 2m 27s
441:	learn: 0.3992819	total: 1m 1s	remaining: 2m 27s
442:	learn: 0.3992113	total: 1m 1s	remaining: 2m 27s
443:	learn: 0.3991475	total: 1m 1s	remaining: 2m 27s
444:	learn: 0.3990575	total: 1m 2s	remaining: 2m 27s
445:	learn: 0.3989809	total: 1m 2s	remaining: 2m 27s
446:	learn: 0.3989239	total: 1m 2s	remaining: 2m 26s
447:	learn: 0.3988423	total: 1m 2s	remaining: 2m 26s
448:	learn: 0.3987733	total: 1m 2s	remaining: 2m 26s
449:	learn: 0.3987060	total: 1m 2s	remaining: 2m 26s
450:	learn: 0.3986266	total: 1m 2s	remaining: 2m 26s
451:	learn: 0.3985655	total: 1m 3s	remaining: 2m 26s
452:	learn: 0.3985031	total: 1m 3s	remaining: 2m 26s
453:	learn: 0.3984407	total: 1m 3s	remaining: 2m 25s
454:	learn: 0.3983777	total: 1m 3s	remaining: 2m 25s
455:	learn: 0.3983132	total: 1m 3s	remaining: 2m 25s
456:	learn: 0.3982277	total: 1m 3s	remaining: 2m 26s
457:	learn: 0.3981757	total: 1m 4s	remaining: 2m 25s
458:	learn: 0.3981007	total: 1m 4s	remaining: 2m 25s
459:	learn: 0.3980368	total: 1m 4s	remaining: 2m 25s
460:	learn: 0.3979535	total: 1m 4s	remaining: 2m 25s
461:	learn: 0.3978906	total: 1m 4s	remaining: 2m 25s
462:	learn: 0.3978129	total: 1m 5s	remaining: 2m 25s
463:	learn: 0.3977337	total: 1m 5s	remaining: 2m 25s
464:	learn: 0.3976748	total: 1m 5s	remaining: 2m 25s
465:	learn: 0.3976085	total: 1m 5s	remaining: 2m 25s
466:	learn: 0.3975384	total: 1m 5s	remaining: 2m 25s
467:	learn: 0.3974647	total: 1m 6s	remaining: 2m 25s
468:	learn: 0.3974143	total: 1m 6s	remaining: 2m 25s
469:	learn: 0.3973523	total: 1m 6s	remaining: 2m 25s
470:	learn: 0.3972767	total: 1m 6s	remaining: 2m 25s
471:	learn: 0.3972058	total: 1m 6s	remaining: 2m 25s
472:	learn: 0.3971352	total: 1m 6s	remaining: 2m 25s
473:	learn: 0.3970757	total: 1m 7s	remaining: 2m 25s
474:	learn: 0.3970242	total: 1m 7s	remaining: 2m 25s
475:	learn: 0.3969724	total: 1m 7s	remaining: 2m 25s
476:	learn: 0.3968133	total: 1m 7s	remaining: 2m 24s
477:	learn: 0.3967645	total: 1m 7s	remaining: 2m 24s
478:	learn: 0.3967168	total: 1m 7s	remaining: 2m 24s
479:	learn: 0.3966544	total: 1m 8s	remaining: 2m 24s
480:	learn: 0.3965942	total: 1m 8s	remaining: 2m 24s
481:	learn: 0.3965477	total: 1m 8s	remaining: 2m 24s
482:	learn: 0.3964994	total: 1m 8s	remaining: 2m 24s
483:	learn: 0.3964573	total: 1m 8s	remaining: 2m 23s
484:	learn: 0.3964095	total: 1m 8s	remaining: 2m 23s
485:	learn: 0.3963579	total: 1m 8s	remaining: 2m 23s
486:	learn: 0.3962910	total: 1m 9s	remaining: 2m 23s
487:	learn: 0.3962304	total: 1m 9s	remaining: 2m 23s
488:	learn: 0.3961649	total: 1m 9s	remaining: 2m 23s
489:	learn: 0.3960969	total: 1m 9s	remaining: 2m 23s
490:	learn: 0.3960433	total: 1m 9s	remaining: 2m 23s
491:	learn: 0.3959893	total: 1m 9s	remaining: 2m 23s
492:	learn: 0.3959320	total: 1m 9s	remaining: 2m 22s
493:	learn: 0.3958686	total: 1m 10s	remaining: 2m 22s
494:	learn: 0.3958142	total: 1m 10s	remaining: 2m 22s
495:	learn: 0.3957573	total: 1m 10s	remaining: 2m 22s
496:	learn: 0.3956990	total: 1m 10s	remaining: 2m 22s
497:	learn: 0.3956471	total: 1m 10s	remaining: 2m 22s
498:	learn: 0.3955932	total: 1m 10s	remaining: 2m 22s
499:	learn: 0.3955279	total: 1m 11s	remaining: 2m 22s
500:	learn: 0.3954883	total: 1m 11s	remaining: 2m 21s
501:	learn: 0.3954346	total: 1m 11s	remaining: 2m 21s
502:	learn: 0.3953215	total: 1m 11s	remaining: 2m 21s
503:	learn: 0.3952583	total: 1m 11s	remaining: 2m 21s
504:	learn: 0.3951912	total: 1m 11s	remaining: 2m 21s
505:	learn: 0.3951287	total: 1m 11s	remaining: 2m 21s
506:	learn: 0.3950670	total: 1m 12s	remaining: 2m 21s
507:	learn: 0.3949979	total: 1m 12s	remaining: 2m 21s
508:	learn: 0.3949370	total: 1m 12s	remaining: 2m 20s
509:	learn: 0.3948999	total: 1m 12s	remaining: 2m 20s
510:	learn: 0.3948557	total: 1m 12s	remaining: 2m 20s
511:	learn: 0.3948026	total: 1m 12s	remaining: 2m 20s
512:	learn: 0.3947569	total: 1m 13s	remaining: 2m 20s
513:	learn: 0.3946920	total: 1m 13s	remaining: 2m 20s
514:	learn: 0.3946302	total: 1m 13s	remaining: 2m 20s
515:	learn: 0.3945462	total: 1m 13s	remaining: 2m 20s
516:	learn: 0.3944678	total: 1m 13s	remaining: 2m 20s
517:	learn: 0.3944087	total: 1m 13s	remaining: 2m 20s
518:	learn: 0.3943547	total: 1m 14s	remaining: 2m 20s
519:	learn: 0.3943020	total: 1m 14s	remaining: 2m 19s
520:	learn: 0.3942627	total: 1m 14s	remaining: 2m 19s
521:	learn: 0.3942176	total: 1m 14s	remaining: 2m 19s
522:	learn: 0.3941520	total: 1m 14s	remaining: 2m 19s
523:	learn: 0.3940831	total: 1m 14s	remaining: 2m 19s
524:	learn: 0.3940185	total: 1m 14s	remaining: 2m 19s
525:	learn: 0.3939805	total: 1m 15s	remaining: 2m 19s
526:	learn: 0.3939252	total: 1m 15s	remaining: 2m 18s
527:	learn: 0.3938663	total: 1m 15s	remaining: 2m 18s
528:	learn: 0.3938087	total: 1m 15s	remaining: 2m 18s
529:	learn: 0.3937683	total: 1m 15s	remaining: 2m 18s
530:	learn: 0.3937066	total: 1m 15s	remaining: 2m 18s
531:	learn: 0.3936625	total: 1m 16s	remaining: 2m 18s
532:	learn: 0.3936190	total: 1m 16s	remaining: 2m 18s
533:	learn: 0.3935582	total: 1m 16s	remaining: 2m 18s
534:	learn: 0.3935059	total: 1m 16s	remaining: 2m 17s
535:	learn: 0.3934613	total: 1m 16s	remaining: 2m 17s
536:	learn: 0.3934064	total: 1m 16s	remaining: 2m 17s
537:	learn: 0.3933387	total: 1m 16s	remaining: 2m 17s
538:	learn: 0.3932732	total: 1m 16s	remaining: 2m 17s
539:	learn: 0.3932143	total: 1m 17s	remaining: 2m 17s
540:	learn: 0.3931579	total: 1m 17s	remaining: 2m 16s
541:	learn: 0.3930965	total: 1m 17s	remaining: 2m 16s
542:	learn: 0.3930418	total: 1m 17s	remaining: 2m 16s
543:	learn: 0.3929747	total: 1m 17s	remaining: 2m 16s
544:	learn: 0.3929127	total: 1m 17s	remaining: 2m 16s
545:	learn: 0.3928801	total: 1m 18s	remaining: 2m 16s
546:	learn: 0.3928214	total: 1m 18s	remaining: 2m 16s
547:	learn: 0.3927590	total: 1m 18s	remaining: 2m 16s
548:	learn: 0.3927067	total: 1m 18s	remaining: 2m 16s
549:	learn: 0.3926600	total: 1m 18s	remaining: 2m 15s
550:	learn: 0.3926019	total: 1m 18s	remaining: 2m 15s
551:	learn: 0.3925449	total: 1m 18s	remaining: 2m 15s
552:	learn: 0.3924816	total: 1m 19s	remaining: 2m 15s
553:	learn: 0.3924310	total: 1m 19s	remaining: 2m 15s
554:	learn: 0.3923848	total: 1m 19s	remaining: 2m 15s
555:	learn: 0.3923314	total: 1m 19s	remaining: 2m 14s
556:	learn: 0.3922769	total: 1m 19s	remaining: 2m 14s
557:	learn: 0.3922070	total: 1m 19s	remaining: 2m 14s
558:	learn: 0.3921678	total: 1m 19s	remaining: 2m 14s
559:	learn: 0.3921193	total: 1m 20s	remaining: 2m 14s
560:	learn: 0.3920586	total: 1m 20s	remaining: 2m 14s
561:	learn: 0.3919956	total: 1m 20s	remaining: 2m 13s
562:	learn: 0.3919579	total: 1m 20s	remaining: 2m 13s
563:	learn: 0.3919107	total: 1m 20s	remaining: 2m 13s
564:	learn: 0.3918394	total: 1m 20s	remaining: 2m 13s
565:	learn: 0.3917737	total: 1m 20s	remaining: 2m 13s
566:	learn: 0.3917181	total: 1m 21s	remaining: 2m 13s
567:	learn: 0.3916618	total: 1m 21s	remaining: 2m 13s
568:	learn: 0.3915944	total: 1m 21s	remaining: 2m 13s
569:	learn: 0.3915493	total: 1m 21s	remaining: 2m 13s
570:	learn: 0.3914897	total: 1m 21s	remaining: 2m 12s
571:	learn: 0.3914337	total: 1m 21s	remaining: 2m 12s
572:	learn: 0.3913636	total: 1m 22s	remaining: 2m 12s
573:	learn: 0.3913199	total: 1m 22s	remaining: 2m 12s
574:	learn: 0.3912686	total: 1m 22s	remaining: 2m 12s
575:	learn: 0.3912141	total: 1m 22s	remaining: 2m 12s
576:	learn: 0.3911607	total: 1m 22s	remaining: 2m 12s
577:	learn: 0.3910881	total: 1m 22s	remaining: 2m 12s
578:	learn: 0.3910335	total: 1m 22s	remaining: 2m 11s
579:	learn: 0.3909721	total: 1m 23s	remaining: 2m 11s
580:	learn: 0.3909015	total: 1m 23s	remaining: 2m 11s
581:	learn: 0.3908456	total: 1m 23s	remaining: 2m 11s
582:	learn: 0.3907805	total: 1m 23s	remaining: 2m 11s
583:	learn: 0.3907155	total: 1m 23s	remaining: 2m 11s
584:	learn: 0.3906665	total: 1m 23s	remaining: 2m 10s
585:	learn: 0.3906153	total: 1m 23s	remaining: 2m 10s
586:	learn: 0.3905860	total: 1m 23s	remaining: 2m 10s
587:	learn: 0.3905317	total: 1m 24s	remaining: 2m 10s
588:	learn: 0.3904497	total: 1m 24s	remaining: 2m 10s
589:	learn: 0.3903903	total: 1m 24s	remaining: 2m 10s
590:	learn: 0.3903516	total: 1m 24s	remaining: 2m 10s
591:	learn: 0.3903054	total: 1m 24s	remaining: 2m 10s
592:	learn: 0.3902515	total: 1m 24s	remaining: 2m 9s
593:	learn: 0.3902066	total: 1m 25s	remaining: 2m 9s
594:	learn: 0.3901570	total: 1m 25s	remaining: 2m 9s
595:	learn: 0.3901007	total: 1m 25s	remaining: 2m 9s
596:	learn: 0.3900667	total: 1m 25s	remaining: 2m 9s
597:	learn: 0.3900085	total: 1m 25s	remaining: 2m 9s
598:	learn: 0.3899644	total: 1m 25s	remaining: 2m 9s
599:	learn: 0.3899037	total: 1m 25s	remaining: 2m 8s
600:	learn: 0.3898389	total: 1m 26s	remaining: 2m 8s
601:	learn: 0.3897796	total: 1m 26s	remaining: 2m 8s
602:	learn: 0.3897289	total: 1m 26s	remaining: 2m 8s
603:	learn: 0.3896681	total: 1m 26s	remaining: 2m 8s
604:	learn: 0.3896123	total: 1m 26s	remaining: 2m 8s
605:	learn: 0.3895541	total: 1m 26s	remaining: 2m 8s
606:	learn: 0.3894913	total: 1m 27s	remaining: 2m 8s
607:	learn: 0.3894077	total: 1m 27s	remaining: 2m 7s
608:	learn: 0.3893550	total: 1m 27s	remaining: 2m 7s
609:	learn: 0.3892955	total: 1m 27s	remaining: 2m 7s
610:	learn: 0.3892277	total: 1m 27s	remaining: 2m 7s
611:	learn: 0.3891800	total: 1m 27s	remaining: 2m 7s
612:	learn: 0.3891281	total: 1m 28s	remaining: 2m 7s
613:	learn: 0.3890621	total: 1m 28s	remaining: 2m 7s
614:	learn: 0.3890068	total: 1m 28s	remaining: 2m 7s
615:	learn: 0.3889495	total: 1m 28s	remaining: 2m 7s
616:	learn: 0.3888954	total: 1m 28s	remaining: 2m 6s
617:	learn: 0.3888577	total: 1m 28s	remaining: 2m 6s
618:	learn: 0.3888062	total: 1m 28s	remaining: 2m 6s
619:	learn: 0.3887485	total: 1m 29s	remaining: 2m 6s
620:	learn: 0.3887087	total: 1m 29s	remaining: 2m 6s
621:	learn: 0.3886778	total: 1m 29s	remaining: 2m 6s
622:	learn: 0.3886176	total: 1m 29s	remaining: 2m 6s
623:	learn: 0.3885644	total: 1m 29s	remaining: 2m 5s
624:	learn: 0.3885110	total: 1m 29s	remaining: 2m 5s
625:	learn: 0.3884614	total: 1m 29s	remaining: 2m 5s
626:	learn: 0.3884014	total: 1m 30s	remaining: 2m 5s
627:	learn: 0.3883480	total: 1m 30s	remaining: 2m 5s
628:	learn: 0.3882946	total: 1m 30s	remaining: 2m 4s
629:	learn: 0.3881700	total: 1m 30s	remaining: 2m 4s
630:	learn: 0.3881203	total: 1m 30s	remaining: 2m 4s
631:	learn: 0.3880587	total: 1m 30s	remaining: 2m 4s
632:	learn: 0.3879912	total: 1m 30s	remaining: 2m 4s
633:	learn: 0.3879399	total: 1m 30s	remaining: 2m 4s
634:	learn: 0.3878936	total: 1m 31s	remaining: 2m 4s
635:	learn: 0.3878363	total: 1m 31s	remaining: 2m 3s
636:	learn: 0.3877865	total: 1m 31s	remaining: 2m 3s
637:	learn: 0.3877409	total: 1m 31s	remaining: 2m 3s
638:	learn: 0.3876936	total: 1m 31s	remaining: 2m 3s
639:	learn: 0.3876419	total: 1m 31s	remaining: 2m 3s
640:	learn: 0.3875909	total: 1m 31s	remaining: 2m 2s
641:	learn: 0.3875415	total: 1m 31s	remaining: 2m 2s
642:	learn: 0.3874950	total: 1m 32s	remaining: 2m 2s
643:	learn: 0.3874467	total: 1m 32s	remaining: 2m 2s
644:	learn: 0.3873937	total: 1m 32s	remaining: 2m 2s
645:	learn: 0.3873305	total: 1m 32s	remaining: 2m 2s
646:	learn: 0.3872816	total: 1m 32s	remaining: 2m 2s
647:	learn: 0.3872313	total: 1m 32s	remaining: 2m 1s
648:	learn: 0.3871787	total: 1m 32s	remaining: 2m 1s
649:	learn: 0.3871348	total: 1m 32s	remaining: 2m 1s
650:	learn: 0.3870922	total: 1m 33s	remaining: 2m 1s
651:	learn: 0.3870303	total: 1m 33s	remaining: 2m 1s
652:	learn: 0.3869706	total: 1m 33s	remaining: 2m 1s
653:	learn: 0.3869168	total: 1m 33s	remaining: 2m
654:	learn: 0.3868781	total: 1m 33s	remaining: 2m
655:	learn: 0.3868075	total: 1m 33s	remaining: 2m
656:	learn: 0.3867574	total: 1m 33s	remaining: 2m
657:	learn: 0.3867192	total: 1m 34s	remaining: 2m
658:	learn: 0.3866594	total: 1m 34s	remaining: 2m
659:	learn: 0.3866060	total: 1m 34s	remaining: 2m
660:	learn: 0.3865569	total: 1m 34s	remaining: 1m 59s
661:	learn: 0.3865112	total: 1m 34s	remaining: 1m 59s
662:	learn: 0.3864653	total: 1m 34s	remaining: 1m 59s
663:	learn: 0.3864100	total: 1m 34s	remaining: 1m 59s
664:	learn: 0.3863442	total: 1m 34s	remaining: 1m 59s
665:	learn: 0.3862799	total: 1m 35s	remaining: 1m 59s
666:	learn: 0.3862217	total: 1m 35s	remaining: 1m 58s
667:	learn: 0.3861665	total: 1m 35s	remaining: 1m 58s
668:	learn: 0.3861211	total: 1m 35s	remaining: 1m 58s
669:	learn: 0.3860805	total: 1m 35s	remaining: 1m 58s
670:	learn: 0.3860367	total: 1m 35s	remaining: 1m 58s
671:	learn: 0.3859814	total: 1m 35s	remaining: 1m 57s
672:	learn: 0.3859304	total: 1m 35s	remaining: 1m 57s
673:	learn: 0.3858862	total: 1m 35s	remaining: 1m 57s
674:	learn: 0.3858571	total: 1m 36s	remaining: 1m 57s
675:	learn: 0.3858118	total: 1m 36s	remaining: 1m 57s
676:	learn: 0.3857691	total: 1m 36s	remaining: 1m 57s
677:	learn: 0.3857172	total: 1m 36s	remaining: 1m 56s
678:	learn: 0.3856640	total: 1m 36s	remaining: 1m 56s
679:	learn: 0.3856019	total: 1m 36s	remaining: 1m 56s
680:	learn: 0.3855579	total: 1m 36s	remaining: 1m 56s
681:	learn: 0.3854989	total: 1m 36s	remaining: 1m 56s
682:	learn: 0.3854641	total: 1m 37s	remaining: 1m 56s
683:	learn: 0.3854155	total: 1m 37s	remaining: 1m 55s
684:	learn: 0.3853701	total: 1m 37s	remaining: 1m 55s
685:	learn: 0.3853243	total: 1m 37s	remaining: 1m 55s
686:	learn: 0.3852686	total: 1m 37s	remaining: 1m 55s
687:	learn: 0.3852154	total: 1m 37s	remaining: 1m 55s
688:	learn: 0.3851785	total: 1m 37s	remaining: 1m 55s
689:	learn: 0.3851270	total: 1m 38s	remaining: 1m 55s
690:	learn: 0.3850792	total: 1m 38s	remaining: 1m 54s
691:	learn: 0.3850431	total: 1m 38s	remaining: 1m 54s
692:	learn: 0.3849861	total: 1m 38s	remaining: 1m 54s
693:	learn: 0.3849252	total: 1m 38s	remaining: 1m 54s
694:	learn: 0.3848787	total: 1m 38s	remaining: 1m 54s
695:	learn: 0.3848415	total: 1m 38s	remaining: 1m 54s
696:	learn: 0.3847790	total: 1m 38s	remaining: 1m 53s
697:	learn: 0.3847271	total: 1m 39s	remaining: 1m 53s
698:	learn: 0.3846669	total: 1m 39s	remaining: 1m 53s
699:	learn: 0.3846145	total: 1m 39s	remaining: 1m 53s
700:	learn: 0.3845703	total: 1m 39s	remaining: 1m 53s
701:	learn: 0.3844994	total: 1m 39s	remaining: 1m 53s
702:	learn: 0.3844515	total: 1m 39s	remaining: 1m 52s
703:	learn: 0.3843973	total: 1m 39s	remaining: 1m 52s
704:	learn: 0.3843359	total: 1m 39s	remaining: 1m 52s
705:	learn: 0.3842962	total: 1m 40s	remaining: 1m 52s
706:	learn: 0.3842504	total: 1m 40s	remaining: 1m 52s
707:	learn: 0.3841977	total: 1m 40s	remaining: 1m 52s
708:	learn: 0.3841632	total: 1m 40s	remaining: 1m 52s
709:	learn: 0.3841039	total: 1m 40s	remaining: 1m 51s
710:	learn: 0.3840458	total: 1m 40s	remaining: 1m 51s
711:	learn: 0.3839920	total: 1m 40s	remaining: 1m 51s
712:	learn: 0.3839393	total: 1m 40s	remaining: 1m 51s
713:	learn: 0.3838753	total: 1m 41s	remaining: 1m 51s
714:	learn: 0.3838233	total: 1m 41s	remaining: 1m 51s
715:	learn: 0.3837802	total: 1m 41s	remaining: 1m 51s
716:	learn: 0.3837339	total: 1m 41s	remaining: 1m 50s
717:	learn: 0.3836769	total: 1m 41s	remaining: 1m 50s
718:	learn: 0.3836346	total: 1m 41s	remaining: 1m 50s
719:	learn: 0.3835769	total: 1m 42s	remaining: 1m 50s
720:	learn: 0.3835308	total: 1m 42s	remaining: 1m 50s
721:	learn: 0.3834775	total: 1m 42s	remaining: 1m 50s
722:	learn: 0.3834226	total: 1m 42s	remaining: 1m 50s
723:	learn: 0.3833736	total: 1m 42s	remaining: 1m 49s
724:	learn: 0.3833311	total: 1m 42s	remaining: 1m 49s
725:	learn: 0.3832767	total: 1m 42s	remaining: 1m 49s
726:	learn: 0.3832269	total: 1m 42s	remaining: 1m 49s
727:	learn: 0.3831745	total: 1m 43s	remaining: 1m 49s
728:	learn: 0.3831258	total: 1m 43s	remaining: 1m 49s
729:	learn: 0.3830813	total: 1m 43s	remaining: 1m 48s
730:	learn: 0.3830221	total: 1m 43s	remaining: 1m 48s
731:	learn: 0.3829681	total: 1m 43s	remaining: 1m 48s
732:	learn: 0.3829196	total: 1m 43s	remaining: 1m 48s
733:	learn: 0.3828814	total: 1m 43s	remaining: 1m 48s
734:	learn: 0.3828311	total: 1m 43s	remaining: 1m 48s
735:	learn: 0.3827900	total: 1m 44s	remaining: 1m 48s
736:	learn: 0.3827336	total: 1m 44s	remaining: 1m 47s
737:	learn: 0.3826810	total: 1m 44s	remaining: 1m 47s
738:	learn: 0.3826192	total: 1m 44s	remaining: 1m 47s
739:	learn: 0.3825535	total: 1m 44s	remaining: 1m 47s
740:	learn: 0.3825042	total: 1m 44s	remaining: 1m 47s
741:	learn: 0.3824557	total: 1m 44s	remaining: 1m 47s
742:	learn: 0.3824060	total: 1m 45s	remaining: 1m 47s
743:	learn: 0.3823550	total: 1m 45s	remaining: 1m 46s
744:	learn: 0.3823253	total: 1m 45s	remaining: 1m 46s
745:	learn: 0.3822731	total: 1m 45s	remaining: 1m 46s
746:	learn: 0.3822135	total: 1m 45s	remaining: 1m 46s
747:	learn: 0.3821714	total: 1m 45s	remaining: 1m 46s
748:	learn: 0.3821095	total: 1m 45s	remaining: 1m 46s
749:	learn: 0.3820834	total: 1m 46s	remaining: 1m 46s
750:	learn: 0.3820378	total: 1m 46s	remaining: 1m 45s
751:	learn: 0.3819900	total: 1m 46s	remaining: 1m 45s
752:	learn: 0.3819523	total: 1m 46s	remaining: 1m 45s
753:	learn: 0.3818962	total: 1m 46s	remaining: 1m 45s
754:	learn: 0.3818503	total: 1m 46s	remaining: 1m 45s
755:	learn: 0.3818142	total: 1m 46s	remaining: 1m 45s
756:	learn: 0.3817610	total: 1m 47s	remaining: 1m 45s
757:	learn: 0.3817231	total: 1m 47s	remaining: 1m 44s
758:	learn: 0.3817003	total: 1m 47s	remaining: 1m 44s
759:	learn: 0.3816574	total: 1m 47s	remaining: 1m 44s
760:	learn: 0.3816069	total: 1m 47s	remaining: 1m 44s
761:	learn: 0.3815644	total: 1m 47s	remaining: 1m 44s
762:	learn: 0.3815136	total: 1m 47s	remaining: 1m 44s
763:	learn: 0.3814607	total: 1m 47s	remaining: 1m 43s
764:	learn: 0.3814281	total: 1m 48s	remaining: 1m 43s
765:	learn: 0.3813663	total: 1m 48s	remaining: 1m 43s
766:	learn: 0.3813120	total: 1m 48s	remaining: 1m 43s
767:	learn: 0.3812522	total: 1m 48s	remaining: 1m 43s
768:	learn: 0.3811912	total: 1m 48s	remaining: 1m 43s
769:	learn: 0.3811478	total: 1m 48s	remaining: 1m 43s
770:	learn: 0.3811003	total: 1m 48s	remaining: 1m 43s
771:	learn: 0.3810670	total: 1m 49s	remaining: 1m 42s
772:	learn: 0.3810181	total: 1m 49s	remaining: 1m 42s
773:	learn: 0.3809717	total: 1m 49s	remaining: 1m 42s
774:	learn: 0.3809269	total: 1m 49s	remaining: 1m 42s
775:	learn: 0.3808755	total: 1m 49s	remaining: 1m 42s
776:	learn: 0.3808276	total: 1m 49s	remaining: 1m 42s
777:	learn: 0.3807853	total: 1m 49s	remaining: 1m 42s
778:	learn: 0.3807235	total: 1m 50s	remaining: 1m 41s
779:	learn: 0.3806842	total: 1m 50s	remaining: 1m 41s
780:	learn: 0.3806260	total: 1m 50s	remaining: 1m 41s
781:	learn: 0.3805775	total: 1m 50s	remaining: 1m 41s
782:	learn: 0.3805161	total: 1m 50s	remaining: 1m 41s
783:	learn: 0.3804569	total: 1m 50s	remaining: 1m 41s
784:	learn: 0.3804007	total: 1m 50s	remaining: 1m 41s
785:	learn: 0.3803715	total: 1m 51s	remaining: 1m 40s
786:	learn: 0.3803262	total: 1m 51s	remaining: 1m 40s
787:	learn: 0.3802802	total: 1m 51s	remaining: 1m 40s
788:	learn: 0.3802572	total: 1m 51s	remaining: 1m 40s
789:	learn: 0.3802121	total: 1m 51s	remaining: 1m 40s
790:	learn: 0.3801709	total: 1m 51s	remaining: 1m 40s
791:	learn: 0.3801121	total: 1m 51s	remaining: 1m 39s
792:	learn: 0.3800630	total: 1m 51s	remaining: 1m 39s
793:	learn: 0.3800083	total: 1m 52s	remaining: 1m 39s
794:	learn: 0.3799740	total: 1m 52s	remaining: 1m 39s
795:	learn: 0.3799113	total: 1m 52s	remaining: 1m 39s
796:	learn: 0.3798570	total: 1m 52s	remaining: 1m 39s
797:	learn: 0.3798044	total: 1m 52s	remaining: 1m 39s
798:	learn: 0.3797516	total: 1m 52s	remaining: 1m 38s
799:	learn: 0.3796905	total: 1m 52s	remaining: 1m 38s
800:	learn: 0.3796396	total: 1m 53s	remaining: 1m 38s
801:	learn: 0.3795874	total: 1m 53s	remaining: 1m 38s
802:	learn: 0.3795420	total: 1m 53s	remaining: 1m 38s
803:	learn: 0.3794833	total: 1m 53s	remaining: 1m 38s
804:	learn: 0.3794482	total: 1m 53s	remaining: 1m 38s
805:	learn: 0.3793909	total: 1m 53s	remaining: 1m 38s
806:	learn: 0.3793254	total: 1m 53s	remaining: 1m 37s
807:	learn: 0.3792725	total: 1m 54s	remaining: 1m 37s
808:	learn: 0.3792242	total: 1m 54s	remaining: 1m 37s
809:	learn: 0.3791849	total: 1m 54s	remaining: 1m 37s
810:	learn: 0.3791365	total: 1m 54s	remaining: 1m 37s
811:	learn: 0.3790839	total: 1m 54s	remaining: 1m 37s
812:	learn: 0.3790300	total: 1m 54s	remaining: 1m 36s
813:	learn: 0.3789741	total: 1m 54s	remaining: 1m 36s
814:	learn: 0.3789215	total: 1m 55s	remaining: 1m 36s
815:	learn: 0.3788538	total: 1m 55s	remaining: 1m 36s
816:	learn: 0.3788111	total: 1m 55s	remaining: 1m 36s
817:	learn: 0.3787536	total: 1m 55s	remaining: 1m 36s
818:	learn: 0.3787023	total: 1m 55s	remaining: 1m 36s
819:	learn: 0.3786520	total: 1m 55s	remaining: 1m 35s
820:	learn: 0.3785895	total: 1m 55s	remaining: 1m 35s
821:	learn: 0.3785346	total: 1m 56s	remaining: 1m 35s
822:	learn: 0.3784930	total: 1m 56s	remaining: 1m 35s
823:	learn: 0.3784482	total: 1m 56s	remaining: 1m 35s
824:	learn: 0.3783949	total: 1m 56s	remaining: 1m 35s
825:	learn: 0.3783553	total: 1m 56s	remaining: 1m 35s
826:	learn: 0.3783004	total: 1m 56s	remaining: 1m 35s
827:	learn: 0.3782595	total: 1m 56s	remaining: 1m 34s
828:	learn: 0.3782095	total: 1m 57s	remaining: 1m 34s
829:	learn: 0.3781370	total: 1m 57s	remaining: 1m 34s
830:	learn: 0.3780790	total: 1m 57s	remaining: 1m 34s
831:	learn: 0.3780350	total: 1m 57s	remaining: 1m 34s
832:	learn: 0.3779880	total: 1m 57s	remaining: 1m 34s
833:	learn: 0.3779298	total: 1m 57s	remaining: 1m 34s
834:	learn: 0.3778821	total: 1m 58s	remaining: 1m 34s
835:	learn: 0.3778370	total: 1m 58s	remaining: 1m 33s
836:	learn: 0.3777997	total: 1m 58s	remaining: 1m 33s
837:	learn: 0.3777448	total: 1m 58s	remaining: 1m 33s
838:	learn: 0.3777115	total: 1m 58s	remaining: 1m 33s
839:	learn: 0.3776540	total: 1m 58s	remaining: 1m 33s
840:	learn: 0.3776067	total: 1m 58s	remaining: 1m 33s
841:	learn: 0.3775626	total: 1m 59s	remaining: 1m 33s
842:	learn: 0.3775205	total: 1m 59s	remaining: 1m 32s
843:	learn: 0.3774649	total: 1m 59s	remaining: 1m 32s
844:	learn: 0.3774161	total: 1m 59s	remaining: 1m 32s
845:	learn: 0.3773858	total: 1m 59s	remaining: 1m 32s
846:	learn: 0.3773375	total: 1m 59s	remaining: 1m 32s
847:	learn: 0.3772796	total: 1m 59s	remaining: 1m 32s
848:	learn: 0.3772305	total: 1m 59s	remaining: 1m 31s
849:	learn: 0.3771775	total: 2m	remaining: 1m 31s
850:	learn: 0.3771265	total: 2m	remaining: 1m 31s
851:	learn: 0.3770757	total: 2m	remaining: 1m 31s
852:	learn: 0.3770181	total: 2m	remaining: 1m 31s
853:	learn: 0.3769821	total: 2m	remaining: 1m 31s
854:	learn: 0.3769270	total: 2m	remaining: 1m 31s
855:	learn: 0.3768786	total: 2m	remaining: 1m 30s
856:	learn: 0.3768320	total: 2m	remaining: 1m 30s
857:	learn: 0.3767843	total: 2m 1s	remaining: 1m 30s
858:	learn: 0.3767472	total: 2m 1s	remaining: 1m 30s
859:	learn: 0.3767023	total: 2m 1s	remaining: 1m 30s
860:	learn: 0.3766468	total: 2m 1s	remaining: 1m 30s
861:	learn: 0.3766092	total: 2m 1s	remaining: 1m 29s
862:	learn: 0.3765695	total: 2m 1s	remaining: 1m 29s
863:	learn: 0.3765203	total: 2m 1s	remaining: 1m 29s
864:	learn: 0.3764687	total: 2m 1s	remaining: 1m 29s
865:	learn: 0.3764251	total: 2m 2s	remaining: 1m 29s
866:	learn: 0.3763784	total: 2m 2s	remaining: 1m 29s
867:	learn: 0.3763202	total: 2m 2s	remaining: 1m 29s
868:	learn: 0.3762819	total: 2m 2s	remaining: 1m 28s
869:	learn: 0.3762483	total: 2m 2s	remaining: 1m 28s
870:	learn: 0.3761961	total: 2m 2s	remaining: 1m 28s
871:	learn: 0.3761711	total: 2m 2s	remaining: 1m 28s
872:	learn: 0.3761070	total: 2m 2s	remaining: 1m 28s
873:	learn: 0.3760588	total: 2m 3s	remaining: 1m 28s
874:	learn: 0.3760092	total: 2m 3s	remaining: 1m 27s
875:	learn: 0.3759718	total: 2m 3s	remaining: 1m 27s
876:	learn: 0.3759354	total: 2m 3s	remaining: 1m 27s
877:	learn: 0.3758722	total: 2m 3s	remaining: 1m 27s
878:	learn: 0.3758230	total: 2m 3s	remaining: 1m 27s
879:	learn: 0.3757733	total: 2m 3s	remaining: 1m 27s
880:	learn: 0.3757254	total: 2m 3s	remaining: 1m 27s
881:	learn: 0.3756728	total: 2m 4s	remaining: 1m 26s
882:	learn: 0.3756290	total: 2m 4s	remaining: 1m 26s
883:	learn: 0.3755786	total: 2m 4s	remaining: 1m 26s
884:	learn: 0.3755347	total: 2m 4s	remaining: 1m 26s
885:	learn: 0.3754790	total: 2m 4s	remaining: 1m 26s
886:	learn: 0.3754322	total: 2m 4s	remaining: 1m 26s
887:	learn: 0.3753883	total: 2m 4s	remaining: 1m 26s
888:	learn: 0.3753507	total: 2m 4s	remaining: 1m 25s
889:	learn: 0.3753197	total: 2m 5s	remaining: 1m 25s
890:	learn: 0.3752718	total: 2m 5s	remaining: 1m 25s
891:	learn: 0.3752334	total: 2m 5s	remaining: 1m 25s
892:	learn: 0.3751809	total: 2m 5s	remaining: 1m 25s
893:	learn: 0.3751323	total: 2m 5s	remaining: 1m 25s
894:	learn: 0.3750810	total: 2m 5s	remaining: 1m 24s
895:	learn: 0.3750131	total: 2m 5s	remaining: 1m 24s
896:	learn: 0.3749676	total: 2m 5s	remaining: 1m 24s
897:	learn: 0.3749132	total: 2m 6s	remaining: 1m 24s
898:	learn: 0.3748803	total: 2m 6s	remaining: 1m 24s
899:	learn: 0.3748418	total: 2m 6s	remaining: 1m 24s
900:	learn: 0.3748044	total: 2m 6s	remaining: 1m 24s
901:	learn: 0.3747570	total: 2m 6s	remaining: 1m 23s
902:	learn: 0.3747040	total: 2m 6s	remaining: 1m 23s
903:	learn: 0.3746664	total: 2m 6s	remaining: 1m 23s
904:	learn: 0.3746141	total: 2m 6s	remaining: 1m 23s
905:	learn: 0.3745794	total: 2m 7s	remaining: 1m 23s
906:	learn: 0.3745232	total: 2m 7s	remaining: 1m 23s
907:	learn: 0.3744786	total: 2m 7s	remaining: 1m 22s
908:	learn: 0.3744320	total: 2m 7s	remaining: 1m 22s
909:	learn: 0.3743816	total: 2m 7s	remaining: 1m 22s
910:	learn: 0.3743265	total: 2m 7s	remaining: 1m 22s
911:	learn: 0.3742721	total: 2m 7s	remaining: 1m 22s
912:	learn: 0.3742318	total: 2m 7s	remaining: 1m 22s
913:	learn: 0.3741942	total: 2m 8s	remaining: 1m 22s
914:	learn: 0.3741434	total: 2m 8s	remaining: 1m 21s
915:	learn: 0.3740973	total: 2m 8s	remaining: 1m 21s
916:	learn: 0.3740535	total: 2m 8s	remaining: 1m 21s
917:	learn: 0.3740059	total: 2m 8s	remaining: 1m 21s
918:	learn: 0.3739532	total: 2m 8s	remaining: 1m 21s
919:	learn: 0.3739272	total: 2m 8s	remaining: 1m 21s
920:	learn: 0.3738771	total: 2m 9s	remaining: 1m 21s
921:	learn: 0.3738265	total: 2m 9s	remaining: 1m 20s
922:	learn: 0.3737652	total: 2m 9s	remaining: 1m 20s
923:	learn: 0.3737182	total: 2m 9s	remaining: 1m 20s
924:	learn: 0.3736817	total: 2m 9s	remaining: 1m 20s
925:	learn: 0.3736365	total: 2m 9s	remaining: 1m 20s
926:	learn: 0.3735920	total: 2m 9s	remaining: 1m 20s
927:	learn: 0.3735592	total: 2m 10s	remaining: 1m 20s
928:	learn: 0.3735062	total: 2m 10s	remaining: 1m 20s
929:	learn: 0.3734557	total: 2m 10s	remaining: 1m 19s
930:	learn: 0.3734056	total: 2m 10s	remaining: 1m 19s
931:	learn: 0.3733676	total: 2m 10s	remaining: 1m 19s
932:	learn: 0.3733253	total: 2m 10s	remaining: 1m 19s
933:	learn: 0.3732731	total: 2m 10s	remaining: 1m 19s
934:	learn: 0.3732247	total: 2m 11s	remaining: 1m 19s
935:	learn: 0.3731840	total: 2m 11s	remaining: 1m 19s
936:	learn: 0.3731379	total: 2m 11s	remaining: 1m 18s
937:	learn: 0.3730923	total: 2m 11s	remaining: 1m 18s
938:	learn: 0.3730461	total: 2m 11s	remaining: 1m 18s
939:	learn: 0.3729955	total: 2m 11s	remaining: 1m 18s
940:	learn: 0.3729593	total: 2m 11s	remaining: 1m 18s
941:	learn: 0.3729272	total: 2m 11s	remaining: 1m 18s
942:	learn: 0.3728791	total: 2m 12s	remaining: 1m 18s
943:	learn: 0.3728274	total: 2m 12s	remaining: 1m 17s
944:	learn: 0.3727958	total: 2m 12s	remaining: 1m 17s
945:	learn: 0.3727491	total: 2m 12s	remaining: 1m 17s
946:	learn: 0.3727125	total: 2m 12s	remaining: 1m 17s
947:	learn: 0.3726754	total: 2m 12s	remaining: 1m 17s
948:	learn: 0.3726201	total: 2m 12s	remaining: 1m 17s
949:	learn: 0.3725552	total: 2m 12s	remaining: 1m 16s
950:	learn: 0.3725202	total: 2m 12s	remaining: 1m 16s
951:	learn: 0.3724794	total: 2m 13s	remaining: 1m 16s
952:	learn: 0.3724507	total: 2m 13s	remaining: 1m 16s
953:	learn: 0.3724152	total: 2m 13s	remaining: 1m 16s
954:	learn: 0.3723727	total: 2m 13s	remaining: 1m 16s
955:	learn: 0.3723302	total: 2m 13s	remaining: 1m 15s
956:	learn: 0.3722878	total: 2m 13s	remaining: 1m 15s
957:	learn: 0.3722417	total: 2m 13s	remaining: 1m 15s
958:	learn: 0.3722087	total: 2m 13s	remaining: 1m 15s
959:	learn: 0.3721630	total: 2m 13s	remaining: 1m 15s
960:	learn: 0.3721376	total: 2m 14s	remaining: 1m 15s
961:	learn: 0.3720912	total: 2m 14s	remaining: 1m 15s
962:	learn: 0.3720397	total: 2m 14s	remaining: 1m 14s
963:	learn: 0.3719880	total: 2m 14s	remaining: 1m 14s
964:	learn: 0.3719466	total: 2m 14s	remaining: 1m 14s
965:	learn: 0.3719076	total: 2m 14s	remaining: 1m 14s
966:	learn: 0.3718611	total: 2m 14s	remaining: 1m 14s
967:	learn: 0.3718066	total: 2m 14s	remaining: 1m 14s
968:	learn: 0.3717702	total: 2m 15s	remaining: 1m 14s
969:	learn: 0.3717320	total: 2m 15s	remaining: 1m 13s
970:	learn: 0.3716705	total: 2m 15s	remaining: 1m 13s
971:	learn: 0.3716315	total: 2m 15s	remaining: 1m 13s
972:	learn: 0.3715771	total: 2m 15s	remaining: 1m 13s
973:	learn: 0.3715454	total: 2m 15s	remaining: 1m 13s
974:	learn: 0.3715034	total: 2m 15s	remaining: 1m 13s
975:	learn: 0.3714529	total: 2m 15s	remaining: 1m 12s
976:	learn: 0.3714082	total: 2m 16s	remaining: 1m 12s
977:	learn: 0.3713605	total: 2m 16s	remaining: 1m 12s
978:	learn: 0.3713146	total: 2m 16s	remaining: 1m 12s
979:	learn: 0.3712885	total: 2m 16s	remaining: 1m 12s
980:	learn: 0.3712361	total: 2m 16s	remaining: 1m 12s
981:	learn: 0.3711843	total: 2m 16s	remaining: 1m 12s
982:	learn: 0.3711450	total: 2m 16s	remaining: 1m 11s
983:	learn: 0.3710978	total: 2m 16s	remaining: 1m 11s
984:	learn: 0.3710645	total: 2m 16s	remaining: 1m 11s
985:	learn: 0.3709819	total: 2m 17s	remaining: 1m 11s
986:	learn: 0.3709420	total: 2m 17s	remaining: 1m 11s
987:	learn: 0.3709145	total: 2m 17s	remaining: 1m 11s
988:	learn: 0.3708820	total: 2m 17s	remaining: 1m 11s
989:	learn: 0.3708373	total: 2m 17s	remaining: 1m 10s
990:	learn: 0.3707963	total: 2m 17s	remaining: 1m 10s
991:	learn: 0.3707453	total: 2m 17s	remaining: 1m 10s
992:	learn: 0.3706996	total: 2m 17s	remaining: 1m 10s
993:	learn: 0.3706653	total: 2m 18s	remaining: 1m 10s
994:	learn: 0.3706200	total: 2m 18s	remaining: 1m 10s
995:	learn: 0.3705678	total: 2m 18s	remaining: 1m 9s
996:	learn: 0.3705260	total: 2m 18s	remaining: 1m 9s
997:	learn: 0.3704745	total: 2m 18s	remaining: 1m 9s
998:	learn: 0.3704323	total: 2m 18s	remaining: 1m 9s
999:	learn: 0.3703832	total: 2m 18s	remaining: 1m 9s
1000:	learn: 0.3703476	total: 2m 18s	remaining: 1m 9s
1001:	learn: 0.3703173	total: 2m 18s	remaining: 1m 9s
1002:	learn: 0.3702872	total: 2m 19s	remaining: 1m 8s
1003:	learn: 0.3702493	total: 2m 19s	remaining: 1m 8s
1004:	learn: 0.3701856	total: 2m 19s	remaining: 1m 8s
1005:	learn: 0.3701406	total: 2m 19s	remaining: 1m 8s
1006:	learn: 0.3700876	total: 2m 19s	remaining: 1m 8s
1007:	learn: 0.3700623	total: 2m 19s	remaining: 1m 8s
1008:	learn: 0.3700152	total: 2m 19s	remaining: 1m 8s
1009:	learn: 0.3699703	total: 2m 19s	remaining: 1m 7s
1010:	learn: 0.3699200	total: 2m 20s	remaining: 1m 7s
1011:	learn: 0.3698963	total: 2m 20s	remaining: 1m 7s
1012:	learn: 0.3698412	total: 2m 20s	remaining: 1m 7s
1013:	learn: 0.3698121	total: 2m 20s	remaining: 1m 7s
1014:	learn: 0.3697613	total: 2m 20s	remaining: 1m 7s
1015:	learn: 0.3697298	total: 2m 20s	remaining: 1m 6s
1016:	learn: 0.3696807	total: 2m 20s	remaining: 1m 6s
1017:	learn: 0.3696420	total: 2m 20s	remaining: 1m 6s
1018:	learn: 0.3696123	total: 2m 20s	remaining: 1m 6s
1019:	learn: 0.3695632	total: 2m 21s	remaining: 1m 6s
1020:	learn: 0.3695086	total: 2m 21s	remaining: 1m 6s
1021:	learn: 0.3694540	total: 2m 21s	remaining: 1m 6s
1022:	learn: 0.3694216	total: 2m 21s	remaining: 1m 5s
1023:	learn: 0.3693711	total: 2m 21s	remaining: 1m 5s
1024:	learn: 0.3693178	total: 2m 21s	remaining: 1m 5s
1025:	learn: 0.3692757	total: 2m 21s	remaining: 1m 5s
1026:	learn: 0.3692259	total: 2m 21s	remaining: 1m 5s
1027:	learn: 0.3691842	total: 2m 22s	remaining: 1m 5s
1028:	learn: 0.3691302	total: 2m 22s	remaining: 1m 5s
1029:	learn: 0.3690781	total: 2m 22s	remaining: 1m 4s
1030:	learn: 0.3690232	total: 2m 22s	remaining: 1m 4s
1031:	learn: 0.3689864	total: 2m 22s	remaining: 1m 4s
1032:	learn: 0.3689461	total: 2m 22s	remaining: 1m 4s
1033:	learn: 0.3689000	total: 2m 22s	remaining: 1m 4s
1034:	learn: 0.3688516	total: 2m 22s	remaining: 1m 4s
1035:	learn: 0.3688058	total: 2m 23s	remaining: 1m 4s
1036:	learn: 0.3687516	total: 2m 23s	remaining: 1m 3s
1037:	learn: 0.3687044	total: 2m 23s	remaining: 1m 3s
1038:	learn: 0.3686679	total: 2m 23s	remaining: 1m 3s
1039:	learn: 0.3686324	total: 2m 23s	remaining: 1m 3s
1040:	learn: 0.3685902	total: 2m 23s	remaining: 1m 3s
1041:	learn: 0.3685460	total: 2m 23s	remaining: 1m 3s
1042:	learn: 0.3685154	total: 2m 23s	remaining: 1m 3s
1043:	learn: 0.3684777	total: 2m 24s	remaining: 1m 2s
1044:	learn: 0.3684232	total: 2m 24s	remaining: 1m 2s
1045:	learn: 0.3683872	total: 2m 24s	remaining: 1m 2s
1046:	learn: 0.3683425	total: 2m 24s	remaining: 1m 2s
1047:	learn: 0.3682924	total: 2m 24s	remaining: 1m 2s
1048:	learn: 0.3682515	total: 2m 24s	remaining: 1m 2s
1049:	learn: 0.3682078	total: 2m 24s	remaining: 1m 2s
1050:	learn: 0.3681787	total: 2m 24s	remaining: 1m 1s
1051:	learn: 0.3681414	total: 2m 25s	remaining: 1m 1s
1052:	learn: 0.3680864	total: 2m 25s	remaining: 1m 1s
1053:	learn: 0.3680335	total: 2m 25s	remaining: 1m 1s
1054:	learn: 0.3679991	total: 2m 25s	remaining: 1m 1s
1055:	learn: 0.3679479	total: 2m 25s	remaining: 1m 1s
1056:	learn: 0.3679104	total: 2m 25s	remaining: 1m 1s
1057:	learn: 0.3678710	total: 2m 25s	remaining: 1m
1058:	learn: 0.3678362	total: 2m 26s	remaining: 1m
1059:	learn: 0.3677865	total: 2m 26s	remaining: 1m
1060:	learn: 0.3677512	total: 2m 26s	remaining: 1m
1061:	learn: 0.3677087	total: 2m 26s	remaining: 1m
1062:	learn: 0.3676614	total: 2m 26s	remaining: 1m
1063:	learn: 0.3676231	total: 2m 26s	remaining: 1m
1064:	learn: 0.3675946	total: 2m 26s	remaining: 60s
1065:	learn: 0.3675595	total: 2m 26s	remaining: 59.8s
1066:	learn: 0.3675079	total: 2m 27s	remaining: 59.7s
1067:	learn: 0.3674783	total: 2m 27s	remaining: 59.5s
1068:	learn: 0.3674416	total: 2m 27s	remaining: 59.4s
1069:	learn: 0.3674091	total: 2m 27s	remaining: 59.2s
1070:	learn: 0.3673621	total: 2m 27s	remaining: 59.1s
1071:	learn: 0.3673557	total: 2m 27s	remaining: 58.9s
1072:	learn: 0.3673074	total: 2m 27s	remaining: 58.8s
1073:	learn: 0.3672626	total: 2m 27s	remaining: 58.6s
1074:	learn: 0.3672174	total: 2m 27s	remaining: 58.5s
1075:	learn: 0.3671764	total: 2m 28s	remaining: 58.4s
1076:	learn: 0.3671340	total: 2m 28s	remaining: 58.2s
1077:	learn: 0.3670900	total: 2m 28s	remaining: 58.1s
1078:	learn: 0.3670404	total: 2m 28s	remaining: 57.9s
1079:	learn: 0.3669954	total: 2m 28s	remaining: 57.8s
1080:	learn: 0.3669474	total: 2m 28s	remaining: 57.7s
1081:	learn: 0.3669106	total: 2m 28s	remaining: 57.5s
1082:	learn: 0.3668775	total: 2m 29s	remaining: 57.4s
1083:	learn: 0.3668256	total: 2m 29s	remaining: 57.3s
1084:	learn: 0.3668024	total: 2m 29s	remaining: 57.1s
1085:	learn: 0.3667581	total: 2m 29s	remaining: 57s
1086:	learn: 0.3667177	total: 2m 29s	remaining: 56.9s
1087:	learn: 0.3666670	total: 2m 29s	remaining: 56.7s
1088:	learn: 0.3666148	total: 2m 29s	remaining: 56.6s
1089:	learn: 0.3665645	total: 2m 29s	remaining: 56.4s
1090:	learn: 0.3665158	total: 2m 30s	remaining: 56.3s
1091:	learn: 0.3664683	total: 2m 30s	remaining: 56.1s
1092:	learn: 0.3664250	total: 2m 30s	remaining: 56s
1093:	learn: 0.3663731	total: 2m 30s	remaining: 55.8s
1094:	learn: 0.3663287	total: 2m 30s	remaining: 55.7s
1095:	learn: 0.3662812	total: 2m 30s	remaining: 55.6s
1096:	learn: 0.3662402	total: 2m 30s	remaining: 55.4s
1097:	learn: 0.3662114	total: 2m 30s	remaining: 55.3s
1098:	learn: 0.3661682	total: 2m 31s	remaining: 55.1s
1099:	learn: 0.3661417	total: 2m 31s	remaining: 55s
1100:	learn: 0.3660912	total: 2m 31s	remaining: 54.9s
1101:	learn: 0.3660409	total: 2m 31s	remaining: 54.7s
1102:	learn: 0.3659835	total: 2m 31s	remaining: 54.6s
1103:	learn: 0.3659409	total: 2m 31s	remaining: 54.4s
1104:	learn: 0.3659353	total: 2m 31s	remaining: 54.3s
1105:	learn: 0.3658807	total: 2m 31s	remaining: 54.1s
1106:	learn: 0.3658397	total: 2m 32s	remaining: 54s
1107:	learn: 0.3658045	total: 2m 32s	remaining: 53.8s
1108:	learn: 0.3657725	total: 2m 32s	remaining: 53.7s
1109:	learn: 0.3657369	total: 2m 32s	remaining: 53.5s
1110:	learn: 0.3656899	total: 2m 32s	remaining: 53.4s
1111:	learn: 0.3656517	total: 2m 32s	remaining: 53.2s
1112:	learn: 0.3656131	total: 2m 32s	remaining: 53.1s
1113:	learn: 0.3655695	total: 2m 32s	remaining: 53s
1114:	learn: 0.3655289	total: 2m 32s	remaining: 52.8s
1115:	learn: 0.3654923	total: 2m 33s	remaining: 52.7s
1116:	learn: 0.3654617	total: 2m 33s	remaining: 52.5s
1117:	learn: 0.3654370	total: 2m 33s	remaining: 52.4s
1118:	learn: 0.3654085	total: 2m 33s	remaining: 52.2s
1119:	learn: 0.3653684	total: 2m 33s	remaining: 52.1s
1120:	learn: 0.3653318	total: 2m 33s	remaining: 51.9s
1121:	learn: 0.3652957	total: 2m 33s	remaining: 51.8s
1122:	learn: 0.3652616	total: 2m 33s	remaining: 51.7s
1123:	learn: 0.3652221	total: 2m 33s	remaining: 51.5s
1124:	learn: 0.3651866	total: 2m 34s	remaining: 51.4s
1125:	learn: 0.3651400	total: 2m 34s	remaining: 51.2s
1126:	learn: 0.3650910	total: 2m 34s	remaining: 51.1s
1127:	learn: 0.3650501	total: 2m 34s	remaining: 50.9s
1128:	learn: 0.3649816	total: 2m 34s	remaining: 50.8s
1129:	learn: 0.3649421	total: 2m 34s	remaining: 50.7s
1130:	learn: 0.3648948	total: 2m 34s	remaining: 50.5s
1131:	learn: 0.3648482	total: 2m 34s	remaining: 50.4s
1132:	learn: 0.3648021	total: 2m 35s	remaining: 50.2s
1133:	learn: 0.3647590	total: 2m 35s	remaining: 50.1s
1134:	learn: 0.3647036	total: 2m 35s	remaining: 50s
1135:	learn: 0.3646559	total: 2m 35s	remaining: 49.8s
1136:	learn: 0.3646301	total: 2m 35s	remaining: 49.7s
1137:	learn: 0.3645880	total: 2m 35s	remaining: 49.5s
1138:	learn: 0.3645478	total: 2m 35s	remaining: 49.4s
1139:	learn: 0.3645027	total: 2m 35s	remaining: 49.2s
1140:	learn: 0.3644755	total: 2m 36s	remaining: 49.1s
1141:	learn: 0.3644238	total: 2m 36s	remaining: 49s
1142:	learn: 0.3643815	total: 2m 36s	remaining: 48.8s
1143:	learn: 0.3643457	total: 2m 36s	remaining: 48.7s
1144:	learn: 0.3643095	total: 2m 36s	remaining: 48.5s
1145:	learn: 0.3642545	total: 2m 36s	remaining: 48.4s
1146:	learn: 0.3642205	total: 2m 36s	remaining: 48.3s
1147:	learn: 0.3641894	total: 2m 36s	remaining: 48.1s
1148:	learn: 0.3641417	total: 2m 37s	remaining: 48s
1149:	learn: 0.3640965	total: 2m 37s	remaining: 47.8s
1150:	learn: 0.3640561	total: 2m 37s	remaining: 47.7s
1151:	learn: 0.3640332	total: 2m 37s	remaining: 47.6s
1152:	learn: 0.3639926	total: 2m 37s	remaining: 47.4s
1153:	learn: 0.3639463	total: 2m 37s	remaining: 47.3s
1154:	learn: 0.3639008	total: 2m 37s	remaining: 47.1s
1155:	learn: 0.3638564	total: 2m 37s	remaining: 47s
1156:	learn: 0.3638223	total: 2m 38s	remaining: 46.8s
1157:	learn: 0.3637885	total: 2m 38s	remaining: 46.7s
1158:	learn: 0.3637487	total: 2m 38s	remaining: 46.6s
1159:	learn: 0.3636966	total: 2m 38s	remaining: 46.4s
1160:	learn: 0.3636469	total: 2m 38s	remaining: 46.3s
1161:	learn: 0.3635986	total: 2m 38s	remaining: 46.2s
1162:	learn: 0.3635477	total: 2m 38s	remaining: 46s
1163:	learn: 0.3635080	total: 2m 38s	remaining: 45.9s
1164:	learn: 0.3634687	total: 2m 39s	remaining: 45.7s
1165:	learn: 0.3634272	total: 2m 39s	remaining: 45.6s
1166:	learn: 0.3633765	total: 2m 39s	remaining: 45.5s
1167:	learn: 0.3633206	total: 2m 39s	remaining: 45.3s
1168:	learn: 0.3633003	total: 2m 39s	remaining: 45.2s
1169:	learn: 0.3632774	total: 2m 39s	remaining: 45s
1170:	learn: 0.3632383	total: 2m 39s	remaining: 44.9s
1171:	learn: 0.3632018	total: 2m 39s	remaining: 44.8s
1172:	learn: 0.3631484	total: 2m 40s	remaining: 44.6s
1173:	learn: 0.3630965	total: 2m 40s	remaining: 44.5s
1174:	learn: 0.3630634	total: 2m 40s	remaining: 44.4s
1175:	learn: 0.3630178	total: 2m 40s	remaining: 44.2s
1176:	learn: 0.3629847	total: 2m 40s	remaining: 44.1s
1177:	learn: 0.3629340	total: 2m 40s	remaining: 44s
1178:	learn: 0.3628861	total: 2m 40s	remaining: 43.8s
1179:	learn: 0.3628312	total: 2m 41s	remaining: 43.7s
1180:	learn: 0.3627962	total: 2m 41s	remaining: 43.6s
1181:	learn: 0.3627659	total: 2m 41s	remaining: 43.4s
1182:	learn: 0.3627261	total: 2m 41s	remaining: 43.3s
1183:	learn: 0.3627039	total: 2m 41s	remaining: 43.1s
1184:	learn: 0.3626448	total: 2m 41s	remaining: 43s
1185:	learn: 0.3626103	total: 2m 41s	remaining: 42.9s
1186:	learn: 0.3625572	total: 2m 42s	remaining: 42.7s
1187:	learn: 0.3625112	total: 2m 42s	remaining: 42.6s
1188:	learn: 0.3624792	total: 2m 42s	remaining: 42.5s
1189:	learn: 0.3624335	total: 2m 42s	remaining: 42.3s
1190:	learn: 0.3624086	total: 2m 42s	remaining: 42.2s
1191:	learn: 0.3623666	total: 2m 42s	remaining: 42s
1192:	learn: 0.3623195	total: 2m 42s	remaining: 41.9s
1193:	learn: 0.3622561	total: 2m 42s	remaining: 41.8s
1194:	learn: 0.3622092	total: 2m 43s	remaining: 41.6s
1195:	learn: 0.3621581	total: 2m 43s	remaining: 41.5s
1196:	learn: 0.3621199	total: 2m 43s	remaining: 41.3s
1197:	learn: 0.3620736	total: 2m 43s	remaining: 41.2s
1198:	learn: 0.3620234	total: 2m 43s	remaining: 41.1s
1199:	learn: 0.3619775	total: 2m 43s	remaining: 40.9s
1200:	learn: 0.3619252	total: 2m 43s	remaining: 40.8s
1201:	learn: 0.3618851	total: 2m 43s	remaining: 40.6s
1202:	learn: 0.3618526	total: 2m 44s	remaining: 40.5s
1203:	learn: 0.3618062	total: 2m 44s	remaining: 40.4s
1204:	learn: 0.3617784	total: 2m 44s	remaining: 40.2s
1205:	learn: 0.3617421	total: 2m 44s	remaining: 40.1s
1206:	learn: 0.3617058	total: 2m 44s	remaining: 39.9s
1207:	learn: 0.3616574	total: 2m 44s	remaining: 39.8s
1208:	learn: 0.3616155	total: 2m 44s	remaining: 39.7s
1209:	learn: 0.3615690	total: 2m 44s	remaining: 39.5s
1210:	learn: 0.3615198	total: 2m 45s	remaining: 39.4s
1211:	learn: 0.3614730	total: 2m 45s	remaining: 39.2s
1212:	learn: 0.3614297	total: 2m 45s	remaining: 39.1s
1213:	learn: 0.3613842	total: 2m 45s	remaining: 39s
1214:	learn: 0.3613326	total: 2m 45s	remaining: 38.8s
1215:	learn: 0.3612999	total: 2m 45s	remaining: 38.7s
1216:	learn: 0.3612538	total: 2m 45s	remaining: 38.5s
1217:	learn: 0.3612136	total: 2m 45s	remaining: 38.4s
1218:	learn: 0.3611669	total: 2m 45s	remaining: 38.3s
1219:	learn: 0.3611366	total: 2m 46s	remaining: 38.1s
1220:	learn: 0.3610966	total: 2m 46s	remaining: 38s
1221:	learn: 0.3610418	total: 2m 46s	remaining: 37.9s
1222:	learn: 0.3610039	total: 2m 46s	remaining: 37.7s
1223:	learn: 0.3609608	total: 2m 46s	remaining: 37.6s
1224:	learn: 0.3609179	total: 2m 46s	remaining: 37.4s
1225:	learn: 0.3608698	total: 2m 46s	remaining: 37.3s
1226:	learn: 0.3608225	total: 2m 47s	remaining: 37.2s
1227:	learn: 0.3607768	total: 2m 47s	remaining: 37s
1228:	learn: 0.3607385	total: 2m 47s	remaining: 36.9s
1229:	learn: 0.3606952	total: 2m 47s	remaining: 36.7s
1230:	learn: 0.3606468	total: 2m 47s	remaining: 36.6s
1231:	learn: 0.3606105	total: 2m 47s	remaining: 36.5s
1232:	learn: 0.3605781	total: 2m 47s	remaining: 36.3s
1233:	learn: 0.3605296	total: 2m 47s	remaining: 36.2s
1234:	learn: 0.3604933	total: 2m 47s	remaining: 36s
1235:	learn: 0.3604530	total: 2m 48s	remaining: 35.9s
1236:	learn: 0.3604142	total: 2m 48s	remaining: 35.8s
1237:	learn: 0.3603673	total: 2m 48s	remaining: 35.6s
1238:	learn: 0.3603310	total: 2m 48s	remaining: 35.5s
1239:	learn: 0.3602833	total: 2m 48s	remaining: 35.3s
1240:	learn: 0.3602371	total: 2m 48s	remaining: 35.2s
1241:	learn: 0.3601981	total: 2m 48s	remaining: 35.1s
1242:	learn: 0.3601563	total: 2m 48s	remaining: 34.9s
1243:	learn: 0.3601144	total: 2m 49s	remaining: 34.8s
1244:	learn: 0.3600780	total: 2m 49s	remaining: 34.6s
1245:	learn: 0.3600301	total: 2m 49s	remaining: 34.5s
1246:	learn: 0.3600037	total: 2m 49s	remaining: 34.4s
1247:	learn: 0.3599643	total: 2m 49s	remaining: 34.2s
1248:	learn: 0.3599293	total: 2m 49s	remaining: 34.1s
1249:	learn: 0.3598949	total: 2m 49s	remaining: 33.9s
1250:	learn: 0.3598455	total: 2m 49s	remaining: 33.8s
1251:	learn: 0.3598012	total: 2m 49s	remaining: 33.7s
1252:	learn: 0.3597568	total: 2m 50s	remaining: 33.5s
1253:	learn: 0.3597271	total: 2m 50s	remaining: 33.4s
1254:	learn: 0.3596780	total: 2m 50s	remaining: 33.2s
1255:	learn: 0.3596289	total: 2m 50s	remaining: 33.1s
1256:	learn: 0.3595973	total: 2m 50s	remaining: 33s
1257:	learn: 0.3595601	total: 2m 50s	remaining: 32.8s
1258:	learn: 0.3595106	total: 2m 50s	remaining: 32.7s
1259:	learn: 0.3594902	total: 2m 50s	remaining: 32.5s
1260:	learn: 0.3594535	total: 2m 50s	remaining: 32.4s
1261:	learn: 0.3594091	total: 2m 51s	remaining: 32.3s
1262:	learn: 0.3593658	total: 2m 51s	remaining: 32.1s
1263:	learn: 0.3593305	total: 2m 51s	remaining: 32s
1264:	learn: 0.3592828	total: 2m 51s	remaining: 31.9s
1265:	learn: 0.3592450	total: 2m 51s	remaining: 31.7s
1266:	learn: 0.3591946	total: 2m 51s	remaining: 31.6s
1267:	learn: 0.3591519	total: 2m 51s	remaining: 31.4s
1268:	learn: 0.3591101	total: 2m 51s	remaining: 31.3s
1269:	learn: 0.3590734	total: 2m 52s	remaining: 31.2s
1270:	learn: 0.3590296	total: 2m 52s	remaining: 31s
1271:	learn: 0.3589971	total: 2m 52s	remaining: 30.9s
1272:	learn: 0.3589615	total: 2m 52s	remaining: 30.7s
1273:	learn: 0.3589324	total: 2m 52s	remaining: 30.6s
1274:	learn: 0.3589000	total: 2m 52s	remaining: 30.5s
1275:	learn: 0.3588708	total: 2m 52s	remaining: 30.3s
1276:	learn: 0.3588344	total: 2m 52s	remaining: 30.2s
1277:	learn: 0.3587915	total: 2m 52s	remaining: 30s
1278:	learn: 0.3587397	total: 2m 53s	remaining: 29.9s
1279:	learn: 0.3587097	total: 2m 53s	remaining: 29.8s
1280:	learn: 0.3586698	total: 2m 53s	remaining: 29.6s
1281:	learn: 0.3586220	total: 2m 53s	remaining: 29.5s
1282:	learn: 0.3585725	total: 2m 53s	remaining: 29.3s
1283:	learn: 0.3585316	total: 2m 53s	remaining: 29.2s
1284:	learn: 0.3584872	total: 2m 53s	remaining: 29.1s
1285:	learn: 0.3584354	total: 2m 53s	remaining: 28.9s
1286:	learn: 0.3583956	total: 2m 53s	remaining: 28.8s
1287:	learn: 0.3583536	total: 2m 54s	remaining: 28.6s
1288:	learn: 0.3583167	total: 2m 54s	remaining: 28.5s
1289:	learn: 0.3582725	total: 2m 54s	remaining: 28.4s
1290:	learn: 0.3582497	total: 2m 54s	remaining: 28.2s
1291:	learn: 0.3581999	total: 2m 54s	remaining: 28.1s
1292:	learn: 0.3581528	total: 2m 54s	remaining: 28s
1293:	learn: 0.3581043	total: 2m 54s	remaining: 27.8s
1294:	learn: 0.3580620	total: 2m 54s	remaining: 27.7s
1295:	learn: 0.3580171	total: 2m 54s	remaining: 27.5s
1296:	learn: 0.3579760	total: 2m 55s	remaining: 27.4s
1297:	learn: 0.3579236	total: 2m 55s	remaining: 27.3s
1298:	learn: 0.3578892	total: 2m 55s	remaining: 27.1s
1299:	learn: 0.3578417	total: 2m 55s	remaining: 27s
1300:	learn: 0.3577944	total: 2m 55s	remaining: 26.9s
1301:	learn: 0.3577466	total: 2m 55s	remaining: 26.7s
1302:	learn: 0.3577098	total: 2m 55s	remaining: 26.6s
1303:	learn: 0.3576609	total: 2m 56s	remaining: 26.5s
1304:	learn: 0.3576177	total: 2m 56s	remaining: 26.3s
1305:	learn: 0.3575791	total: 2m 56s	remaining: 26.2s
1306:	learn: 0.3575398	total: 2m 56s	remaining: 26.1s
1307:	learn: 0.3574965	total: 2m 56s	remaining: 25.9s
1308:	learn: 0.3574526	total: 2m 56s	remaining: 25.8s
1309:	learn: 0.3574146	total: 2m 56s	remaining: 25.7s
1310:	learn: 0.3573924	total: 2m 57s	remaining: 25.5s
1311:	learn: 0.3573642	total: 2m 57s	remaining: 25.4s
1312:	learn: 0.3573364	total: 2m 57s	remaining: 25.2s
1313:	learn: 0.3572943	total: 2m 57s	remaining: 25.1s
1314:	learn: 0.3572655	total: 2m 57s	remaining: 25s
1315:	learn: 0.3572273	total: 2m 57s	remaining: 24.8s
1316:	learn: 0.3571871	total: 2m 57s	remaining: 24.7s
1317:	learn: 0.3571431	total: 2m 57s	remaining: 24.6s
1318:	learn: 0.3571102	total: 2m 58s	remaining: 24.4s
1319:	learn: 0.3570701	total: 2m 58s	remaining: 24.3s
1320:	learn: 0.3570383	total: 2m 58s	remaining: 24.2s
1321:	learn: 0.3570009	total: 2m 58s	remaining: 24s
1322:	learn: 0.3569629	total: 2m 58s	remaining: 23.9s
1323:	learn: 0.3569095	total: 2m 58s	remaining: 23.8s
1324:	learn: 0.3568731	total: 2m 58s	remaining: 23.6s
1325:	learn: 0.3568471	total: 2m 58s	remaining: 23.5s
1326:	learn: 0.3568267	total: 2m 59s	remaining: 23.3s
1327:	learn: 0.3567946	total: 2m 59s	remaining: 23.2s
1328:	learn: 0.3567518	total: 2m 59s	remaining: 23.1s
1329:	learn: 0.3567196	total: 2m 59s	remaining: 22.9s
1330:	learn: 0.3566762	total: 2m 59s	remaining: 22.8s
1331:	learn: 0.3566335	total: 2m 59s	remaining: 22.7s
1332:	learn: 0.3566179	total: 2m 59s	remaining: 22.5s
1333:	learn: 0.3565709	total: 2m 59s	remaining: 22.4s
1334:	learn: 0.3565256	total: 3m	remaining: 22.3s
1335:	learn: 0.3564885	total: 3m	remaining: 22.1s
1336:	learn: 0.3564438	total: 3m	remaining: 22s
1337:	learn: 0.3564107	total: 3m	remaining: 21.8s
1338:	learn: 0.3563772	total: 3m	remaining: 21.7s
1339:	learn: 0.3563264	total: 3m	remaining: 21.6s
1340:	learn: 0.3562780	total: 3m	remaining: 21.4s
1341:	learn: 0.3562339	total: 3m	remaining: 21.3s
1342:	learn: 0.3561935	total: 3m	remaining: 21.2s
1343:	learn: 0.3561614	total: 3m 1s	remaining: 21s
1344:	learn: 0.3561201	total: 3m 1s	remaining: 20.9s
1345:	learn: 0.3560818	total: 3m 1s	remaining: 20.7s
1346:	learn: 0.3560442	total: 3m 1s	remaining: 20.6s
1347:	learn: 0.3559946	total: 3m 1s	remaining: 20.5s
1348:	learn: 0.3559442	total: 3m 1s	remaining: 20.3s
1349:	learn: 0.3559051	total: 3m 1s	remaining: 20.2s
1350:	learn: 0.3558697	total: 3m 1s	remaining: 20.1s
1351:	learn: 0.3558418	total: 3m 2s	remaining: 19.9s
1352:	learn: 0.3558117	total: 3m 2s	remaining: 19.8s
1353:	learn: 0.3557577	total: 3m 2s	remaining: 19.7s
1354:	learn: 0.3557349	total: 3m 2s	remaining: 19.5s
1355:	learn: 0.3556909	total: 3m 2s	remaining: 19.4s
1356:	learn: 0.3556587	total: 3m 2s	remaining: 19.2s
1357:	learn: 0.3556108	total: 3m 2s	remaining: 19.1s
1358:	learn: 0.3555700	total: 3m 2s	remaining: 19s
1359:	learn: 0.3555445	total: 3m 3s	remaining: 18.8s
1360:	learn: 0.3554970	total: 3m 3s	remaining: 18.7s
1361:	learn: 0.3554609	total: 3m 3s	remaining: 18.6s
1362:	learn: 0.3554221	total: 3m 3s	remaining: 18.4s
1363:	learn: 0.3553790	total: 3m 3s	remaining: 18.3s
1364:	learn: 0.3553405	total: 3m 3s	remaining: 18.2s
1365:	learn: 0.3553133	total: 3m 3s	remaining: 18s
1366:	learn: 0.3552814	total: 3m 3s	remaining: 17.9s
1367:	learn: 0.3552388	total: 3m 3s	remaining: 17.8s
1368:	learn: 0.3552004	total: 3m 4s	remaining: 17.6s
1369:	learn: 0.3551670	total: 3m 4s	remaining: 17.5s
1370:	learn: 0.3551320	total: 3m 4s	remaining: 17.3s
1371:	learn: 0.3550833	total: 3m 4s	remaining: 17.2s
1372:	learn: 0.3550433	total: 3m 4s	remaining: 17.1s
1373:	learn: 0.3549928	total: 3m 4s	remaining: 16.9s
1374:	learn: 0.3549740	total: 3m 4s	remaining: 16.8s
1375:	learn: 0.3549231	total: 3m 4s	remaining: 16.7s
1376:	learn: 0.3548788	total: 3m 5s	remaining: 16.5s
1377:	learn: 0.3548351	total: 3m 5s	remaining: 16.4s
1378:	learn: 0.3547882	total: 3m 5s	remaining: 16.3s
1379:	learn: 0.3547425	total: 3m 5s	remaining: 16.1s
1380:	learn: 0.3547022	total: 3m 5s	remaining: 16s
1381:	learn: 0.3546612	total: 3m 5s	remaining: 15.9s
1382:	learn: 0.3546269	total: 3m 5s	remaining: 15.7s
1383:	learn: 0.3545878	total: 3m 5s	remaining: 15.6s
1384:	learn: 0.3545405	total: 3m 6s	remaining: 15.5s
1385:	learn: 0.3544910	total: 3m 6s	remaining: 15.3s
1386:	learn: 0.3544484	total: 3m 6s	remaining: 15.2s
1387:	learn: 0.3544045	total: 3m 6s	remaining: 15s
1388:	learn: 0.3543711	total: 3m 6s	remaining: 14.9s
1389:	learn: 0.3543409	total: 3m 6s	remaining: 14.8s
1390:	learn: 0.3542929	total: 3m 6s	remaining: 14.6s
1391:	learn: 0.3542580	total: 3m 6s	remaining: 14.5s
1392:	learn: 0.3542312	total: 3m 7s	remaining: 14.4s
1393:	learn: 0.3541853	total: 3m 7s	remaining: 14.2s
1394:	learn: 0.3541372	total: 3m 7s	remaining: 14.1s
1395:	learn: 0.3540900	total: 3m 7s	remaining: 14s
1396:	learn: 0.3540294	total: 3m 7s	remaining: 13.8s
1397:	learn: 0.3539862	total: 3m 7s	remaining: 13.7s
1398:	learn: 0.3539392	total: 3m 7s	remaining: 13.6s
1399:	learn: 0.3539022	total: 3m 7s	remaining: 13.4s
1400:	learn: 0.3538627	total: 3m 8s	remaining: 13.3s
1401:	learn: 0.3538294	total: 3m 8s	remaining: 13.2s
1402:	learn: 0.3537836	total: 3m 8s	remaining: 13s
1403:	learn: 0.3537431	total: 3m 8s	remaining: 12.9s
1404:	learn: 0.3537081	total: 3m 8s	remaining: 12.7s
1405:	learn: 0.3536673	total: 3m 8s	remaining: 12.6s
1406:	learn: 0.3536350	total: 3m 8s	remaining: 12.5s
1407:	learn: 0.3536109	total: 3m 8s	remaining: 12.3s
1408:	learn: 0.3535739	total: 3m 8s	remaining: 12.2s
1409:	learn: 0.3535470	total: 3m 9s	remaining: 12.1s
1410:	learn: 0.3535019	total: 3m 9s	remaining: 11.9s
1411:	learn: 0.3534622	total: 3m 9s	remaining: 11.8s
1412:	learn: 0.3534104	total: 3m 9s	remaining: 11.7s
1413:	learn: 0.3533745	total: 3m 9s	remaining: 11.5s
1414:	learn: 0.3533578	total: 3m 9s	remaining: 11.4s
1415:	learn: 0.3533218	total: 3m 9s	remaining: 11.3s
1416:	learn: 0.3532873	total: 3m 9s	remaining: 11.1s
1417:	learn: 0.3532383	total: 3m 10s	remaining: 11s
1418:	learn: 0.3532012	total: 3m 10s	remaining: 10.9s
1419:	learn: 0.3531614	total: 3m 10s	remaining: 10.7s
1420:	learn: 0.3531121	total: 3m 10s	remaining: 10.6s
1421:	learn: 0.3530779	total: 3m 10s	remaining: 10.5s
1422:	learn: 0.3530328	total: 3m 10s	remaining: 10.3s
1423:	learn: 0.3530020	total: 3m 10s	remaining: 10.2s
1424:	learn: 0.3529605	total: 3m 10s	remaining: 10.1s
1425:	learn: 0.3529233	total: 3m 11s	remaining: 9.92s
1426:	learn: 0.3528767	total: 3m 11s	remaining: 9.78s
1427:	learn: 0.3528452	total: 3m 11s	remaining: 9.65s
1428:	learn: 0.3527979	total: 3m 11s	remaining: 9.52s
1429:	learn: 0.3527721	total: 3m 11s	remaining: 9.38s
1430:	learn: 0.3527320	total: 3m 11s	remaining: 9.25s
1431:	learn: 0.3526864	total: 3m 11s	remaining: 9.12s
1432:	learn: 0.3526484	total: 3m 12s	remaining: 8.98s
1433:	learn: 0.3526065	total: 3m 12s	remaining: 8.85s
1434:	learn: 0.3525660	total: 3m 12s	remaining: 8.71s
1435:	learn: 0.3525240	total: 3m 12s	remaining: 8.58s
1436:	learn: 0.3524813	total: 3m 12s	remaining: 8.45s
1437:	learn: 0.3524360	total: 3m 12s	remaining: 8.31s
1438:	learn: 0.3523884	total: 3m 12s	remaining: 8.18s
1439:	learn: 0.3523572	total: 3m 13s	remaining: 8.04s
1440:	learn: 0.3523159	total: 3m 13s	remaining: 7.91s
1441:	learn: 0.3522822	total: 3m 13s	remaining: 7.78s
1442:	learn: 0.3522614	total: 3m 13s	remaining: 7.64s
1443:	learn: 0.3522132	total: 3m 13s	remaining: 7.51s
1444:	learn: 0.3521782	total: 3m 13s	remaining: 7.37s
1445:	learn: 0.3521373	total: 3m 13s	remaining: 7.24s
1446:	learn: 0.3520969	total: 3m 13s	remaining: 7.11s
1447:	learn: 0.3520609	total: 3m 14s	remaining: 6.97s
1448:	learn: 0.3520308	total: 3m 14s	remaining: 6.83s
1449:	learn: 0.3519791	total: 3m 14s	remaining: 6.7s
1450:	learn: 0.3519416	total: 3m 14s	remaining: 6.57s
1451:	learn: 0.3518955	total: 3m 14s	remaining: 6.43s
1452:	learn: 0.3518484	total: 3m 14s	remaining: 6.3s
1453:	learn: 0.3518085	total: 3m 14s	remaining: 6.16s
1454:	learn: 0.3517620	total: 3m 14s	remaining: 6.03s
1455:	learn: 0.3517228	total: 3m 15s	remaining: 5.89s
1456:	learn: 0.3516976	total: 3m 15s	remaining: 5.76s
1457:	learn: 0.3516696	total: 3m 15s	remaining: 5.63s
1458:	learn: 0.3516329	total: 3m 15s	remaining: 5.49s
1459:	learn: 0.3515976	total: 3m 15s	remaining: 5.36s
1460:	learn: 0.3515594	total: 3m 15s	remaining: 5.22s
1461:	learn: 0.3515210	total: 3m 15s	remaining: 5.09s
1462:	learn: 0.3514760	total: 3m 15s	remaining: 4.95s
1463:	learn: 0.3514306	total: 3m 16s	remaining: 4.82s
1464:	learn: 0.3513896	total: 3m 16s	remaining: 4.69s
1465:	learn: 0.3513433	total: 3m 16s	remaining: 4.55s
1466:	learn: 0.3513232	total: 3m 16s	remaining: 4.42s
1467:	learn: 0.3512763	total: 3m 16s	remaining: 4.28s
1468:	learn: 0.3512358	total: 3m 16s	remaining: 4.15s
1469:	learn: 0.3512027	total: 3m 16s	remaining: 4.01s
1470:	learn: 0.3511642	total: 3m 16s	remaining: 3.88s
1471:	learn: 0.3511391	total: 3m 16s	remaining: 3.75s
1472:	learn: 0.3510922	total: 3m 17s	remaining: 3.61s
1473:	learn: 0.3510538	total: 3m 17s	remaining: 3.48s
1474:	learn: 0.3510077	total: 3m 17s	remaining: 3.34s
1475:	learn: 0.3509752	total: 3m 17s	remaining: 3.21s
1476:	learn: 0.3509247	total: 3m 17s	remaining: 3.08s
1477:	learn: 0.3508828	total: 3m 17s	remaining: 2.94s
1478:	learn: 0.3508462	total: 3m 17s	remaining: 2.81s
1479:	learn: 0.3507991	total: 3m 17s	remaining: 2.67s
1480:	learn: 0.3507645	total: 3m 18s	remaining: 2.54s
1481:	learn: 0.3507335	total: 3m 18s	remaining: 2.41s
1482:	learn: 0.3507037	total: 3m 18s	remaining: 2.27s
1483:	learn: 0.3506769	total: 3m 18s	remaining: 2.14s
1484:	learn: 0.3506340	total: 3m 18s	remaining: 2s
1485:	learn: 0.3506071	total: 3m 18s	remaining: 1.87s
1486:	learn: 0.3505759	total: 3m 18s	remaining: 1.74s
1487:	learn: 0.3505358	total: 3m 18s	remaining: 1.6s
1488:	learn: 0.3505131	total: 3m 19s	remaining: 1.47s
1489:	learn: 0.3504821	total: 3m 19s	remaining: 1.34s
1490:	learn: 0.3504375	total: 3m 19s	remaining: 1.2s
1491:	learn: 0.3503992	total: 3m 19s	remaining: 1.07s
1492:	learn: 0.3503648	total: 3m 19s	remaining: 935ms
1493:	learn: 0.3503229	total: 3m 19s	remaining: 802ms
1494:	learn: 0.3502762	total: 3m 19s	remaining: 668ms
1495:	learn: 0.3502411	total: 3m 19s	remaining: 534ms
1496:	learn: 0.3502052	total: 3m 20s	remaining: 401ms
1497:	learn: 0.3501591	total: 3m 20s	remaining: 267ms
1498:	learn: 0.3501239	total: 3m 20s	remaining: 134ms
1499:	learn: 0.3500766	total: 3m 20s	remaining: 0us
Wall time: 3min 23s
Out[87]:
<catboost.core.CatBoostClassifier at 0x232cb9997c0>

Compute FPR + FNR score

In [88]:
catgbc_y_pred = catgbc.predict(catgbc_X_valid)
valid_score = criterion(catgbc_y_pred, catgbc_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.3870198716688109

To further inspect the performance

In [89]:
# to further inspect the performance:
CM = confusion_matrix(catgbc_y_valid, catgbc_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
gb_y_prob = catgbc.predict_proba(catgbc_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(catgbc_y_valid, gb_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[24385  5324]
 [ 6180 23558]]
Accuracy: 0.8064830857738826
False Positive Rate: 0.17920495472752365
False Negative Rate: 0.20781491694128723
FPR + FNR = 0.3870198716688109

Result discussion

The $FPR$ and $FNR$ rate is the lower for this notebook: 0.38. CatBoost provides better performance than XGBoost and LightGBM on the binary classification task.

11/ Voting Classifier

The voting classifier aggregates the predicted class or predicted probability on basis of hard voting or soft voting. So if the goal is to feed a variety of base models to the voting classifier it makes sure to resolve the error by any model.

image.png

Source : https://towardsdatascience.com/use-voting-classifier-to-improve-the-performance-of-your-ml-model-805345f9de0e

To implement the Voting Classifier I use the VotingClassifier lib from scikit-learn.

Moreover I use the voting: hard hyperparameter simply because majority vote strategy improve performance. That means, the class most predicted by the 3 boosting models is the final class predicted by the voting classifier model.

I choose two models encompassing the 3 best classifiers which are XGBoost, LightGBM, CatBoost. For each classifier, I use the best hyperparameters that I defined when I implemented these classifiers independently.

1st Approach : Majority Vote (Hard Voting)

Split whole dataset

In [93]:
voting_gbc_X_train, voting_gbc_X_valid, voting_gbc_y_train, voting_gbc_y_valid = train_test_split(X_dataframe, y, test_size=0.2, random_state=57)

Define hyperparameters and fit the model

In [94]:
%%time

estimators = [
    ('xgbc', XGBClassifier(booster='gbtree', learning_rate=0.3, 
                     max_depth=6, n_estimators=103, 
                     colsample_bynode=1, colsample_bytree=1,
                     subsample=1, gamma=0, 
                     objective='binary:logistic', random_state=57)),
    
    ('lgbc', LGBMClassifier(objective= 'binary', 
                            n_estimators = 2000, random_state=57)),
    
    ('catgbc', CatBoostClassifier(eval_metric= 'Logloss', iterations= 1500, 
                                  learning_rate= 0.1, subsample= 0.8, random_state=57))
]

voting_clf = VotingClassifier(estimators=estimators, voting='hard')

voting_clf.fit(voting_gbc_X_train, voting_gbc_y_train)
0:	learn: 0.6649451	total: 141ms	remaining: 3m 30s
1:	learn: 0.6408262	total: 270ms	remaining: 3m 21s
2:	learn: 0.6222621	total: 408ms	remaining: 3m 23s
3:	learn: 0.6056707	total: 571ms	remaining: 3m 33s
4:	learn: 0.5923237	total: 723ms	remaining: 3m 36s
5:	learn: 0.5796092	total: 879ms	remaining: 3m 38s
6:	learn: 0.5701940	total: 1.03s	remaining: 3m 40s
7:	learn: 0.5614521	total: 1.19s	remaining: 3m 42s
8:	learn: 0.5539308	total: 1.36s	remaining: 3m 44s
9:	learn: 0.5471526	total: 1.51s	remaining: 3m 45s
10:	learn: 0.5414993	total: 1.68s	remaining: 3m 47s
11:	learn: 0.5359282	total: 1.86s	remaining: 3m 50s
12:	learn: 0.5309897	total: 2.03s	remaining: 3m 52s
13:	learn: 0.5271958	total: 2.16s	remaining: 3m 49s
14:	learn: 0.5229875	total: 2.32s	remaining: 3m 49s
15:	learn: 0.5202042	total: 2.48s	remaining: 3m 49s
16:	learn: 0.5164225	total: 2.65s	remaining: 3m 51s
17:	learn: 0.5139612	total: 2.78s	remaining: 3m 49s
18:	learn: 0.5111718	total: 2.95s	remaining: 3m 49s
19:	learn: 0.5085238	total: 3.13s	remaining: 3m 51s
20:	learn: 0.5058842	total: 3.33s	remaining: 3m 54s
21:	learn: 0.5038090	total: 3.5s	remaining: 3m 55s
22:	learn: 0.5019270	total: 3.66s	remaining: 3m 54s
23:	learn: 0.5003162	total: 3.83s	remaining: 3m 55s
24:	learn: 0.4985737	total: 4.01s	remaining: 3m 56s
25:	learn: 0.4972461	total: 4.14s	remaining: 3m 54s
26:	learn: 0.4958450	total: 4.3s	remaining: 3m 54s
27:	learn: 0.4942530	total: 4.49s	remaining: 3m 56s
28:	learn: 0.4931138	total: 4.64s	remaining: 3m 55s
29:	learn: 0.4916998	total: 4.85s	remaining: 3m 57s
30:	learn: 0.4899690	total: 5.03s	remaining: 3m 58s
31:	learn: 0.4889622	total: 5.17s	remaining: 3m 57s
32:	learn: 0.4877336	total: 5.38s	remaining: 3m 59s
33:	learn: 0.4860636	total: 5.57s	remaining: 4m
34:	learn: 0.4851071	total: 5.72s	remaining: 3m 59s
35:	learn: 0.4843042	total: 5.9s	remaining: 3m 59s
36:	learn: 0.4833310	total: 6.07s	remaining: 4m
37:	learn: 0.4822906	total: 6.26s	remaining: 4m
38:	learn: 0.4807681	total: 6.43s	remaining: 4m
39:	learn: 0.4802608	total: 6.59s	remaining: 4m
40:	learn: 0.4796616	total: 6.73s	remaining: 3m 59s
41:	learn: 0.4787774	total: 6.89s	remaining: 3m 59s
42:	learn: 0.4780625	total: 7.04s	remaining: 3m 58s
43:	learn: 0.4762518	total: 7.18s	remaining: 3m 57s
44:	learn: 0.4748372	total: 7.32s	remaining: 3m 56s
45:	learn: 0.4740745	total: 7.47s	remaining: 3m 56s
46:	learn: 0.4724770	total: 7.64s	remaining: 3m 56s
47:	learn: 0.4717461	total: 7.81s	remaining: 3m 56s
48:	learn: 0.4701812	total: 7.95s	remaining: 3m 55s
49:	learn: 0.4695892	total: 8.11s	remaining: 3m 55s
50:	learn: 0.4688520	total: 8.23s	remaining: 3m 53s
51:	learn: 0.4680634	total: 8.4s	remaining: 3m 53s
52:	learn: 0.4670235	total: 8.55s	remaining: 3m 53s
53:	learn: 0.4663750	total: 8.72s	remaining: 3m 53s
54:	learn: 0.4659496	total: 8.84s	remaining: 3m 52s
55:	learn: 0.4648873	total: 9s	remaining: 3m 52s
56:	learn: 0.4640366	total: 9.16s	remaining: 3m 51s
57:	learn: 0.4635239	total: 9.29s	remaining: 3m 51s
58:	learn: 0.4628211	total: 9.45s	remaining: 3m 50s
59:	learn: 0.4624765	total: 9.63s	remaining: 3m 51s
60:	learn: 0.4618908	total: 9.78s	remaining: 3m 50s
61:	learn: 0.4613503	total: 9.9s	remaining: 3m 49s
62:	learn: 0.4609151	total: 10.1s	remaining: 3m 49s
63:	learn: 0.4605220	total: 10.2s	remaining: 3m 49s
64:	learn: 0.4602499	total: 10.4s	remaining: 3m 48s
65:	learn: 0.4597629	total: 10.5s	remaining: 3m 47s
66:	learn: 0.4592989	total: 10.6s	remaining: 3m 47s
67:	learn: 0.4589277	total: 10.8s	remaining: 3m 47s
68:	learn: 0.4585763	total: 10.9s	remaining: 3m 46s
69:	learn: 0.4581531	total: 11s	remaining: 3m 45s
70:	learn: 0.4577356	total: 11.2s	remaining: 3m 45s
71:	learn: 0.4574710	total: 11.3s	remaining: 3m 44s
72:	learn: 0.4571883	total: 11.4s	remaining: 3m 43s
73:	learn: 0.4568751	total: 11.6s	remaining: 3m 42s
74:	learn: 0.4564676	total: 11.7s	remaining: 3m 42s
75:	learn: 0.4562251	total: 11.8s	remaining: 3m 41s
76:	learn: 0.4553923	total: 11.9s	remaining: 3m 40s
77:	learn: 0.4551094	total: 12.1s	remaining: 3m 39s
78:	learn: 0.4547412	total: 12.2s	remaining: 3m 39s
79:	learn: 0.4543632	total: 12.3s	remaining: 3m 38s
80:	learn: 0.4539733	total: 12.5s	remaining: 3m 38s
81:	learn: 0.4530339	total: 12.6s	remaining: 3m 37s
82:	learn: 0.4526909	total: 12.7s	remaining: 3m 37s
83:	learn: 0.4520350	total: 12.9s	remaining: 3m 37s
84:	learn: 0.4517695	total: 13s	remaining: 3m 36s
85:	learn: 0.4515423	total: 13.1s	remaining: 3m 36s
86:	learn: 0.4510530	total: 13.3s	remaining: 3m 35s
87:	learn: 0.4506511	total: 13.4s	remaining: 3m 35s
88:	learn: 0.4497740	total: 13.6s	remaining: 3m 35s
89:	learn: 0.4494407	total: 13.7s	remaining: 3m 34s
90:	learn: 0.4491871	total: 13.8s	remaining: 3m 34s
91:	learn: 0.4488822	total: 14s	remaining: 3m 33s
92:	learn: 0.4484079	total: 14.1s	remaining: 3m 33s
93:	learn: 0.4480556	total: 14.2s	remaining: 3m 32s
94:	learn: 0.4476972	total: 14.4s	remaining: 3m 32s
95:	learn: 0.4473968	total: 14.5s	remaining: 3m 31s
96:	learn: 0.4471603	total: 14.6s	remaining: 3m 31s
97:	learn: 0.4465976	total: 14.8s	remaining: 3m 31s
98:	learn: 0.4463558	total: 14.9s	remaining: 3m 30s
99:	learn: 0.4461760	total: 15s	remaining: 3m 30s
100:	learn: 0.4459214	total: 15.2s	remaining: 3m 30s
101:	learn: 0.4454492	total: 15.4s	remaining: 3m 30s
102:	learn: 0.4448885	total: 15.5s	remaining: 3m 30s
103:	learn: 0.4446436	total: 15.7s	remaining: 3m 30s
104:	learn: 0.4441487	total: 15.8s	remaining: 3m 30s
105:	learn: 0.4439482	total: 16s	remaining: 3m 29s
106:	learn: 0.4436272	total: 16.1s	remaining: 3m 29s
107:	learn: 0.4434740	total: 16.2s	remaining: 3m 29s
108:	learn: 0.4429574	total: 16.4s	remaining: 3m 29s
109:	learn: 0.4425878	total: 16.5s	remaining: 3m 28s
110:	learn: 0.4423793	total: 16.7s	remaining: 3m 28s
111:	learn: 0.4418545	total: 16.8s	remaining: 3m 27s
112:	learn: 0.4417046	total: 16.9s	remaining: 3m 27s
113:	learn: 0.4414700	total: 17.1s	remaining: 3m 27s
114:	learn: 0.4411934	total: 17.2s	remaining: 3m 27s
115:	learn: 0.4410218	total: 17.3s	remaining: 3m 26s
116:	learn: 0.4407950	total: 17.5s	remaining: 3m 26s
117:	learn: 0.4404482	total: 17.6s	remaining: 3m 26s
118:	learn: 0.4400505	total: 17.8s	remaining: 3m 26s
119:	learn: 0.4395959	total: 17.9s	remaining: 3m 26s
120:	learn: 0.4393376	total: 18.1s	remaining: 3m 25s
121:	learn: 0.4391237	total: 18.2s	remaining: 3m 25s
122:	learn: 0.4388553	total: 18.3s	remaining: 3m 25s
123:	learn: 0.4386726	total: 18.5s	remaining: 3m 24s
124:	learn: 0.4380448	total: 18.6s	remaining: 3m 24s
125:	learn: 0.4378355	total: 18.7s	remaining: 3m 24s
126:	learn: 0.4376385	total: 18.8s	remaining: 3m 23s
127:	learn: 0.4374471	total: 19s	remaining: 3m 23s
128:	learn: 0.4370333	total: 19.1s	remaining: 3m 23s
129:	learn: 0.4368561	total: 19.2s	remaining: 3m 22s
130:	learn: 0.4364377	total: 19.3s	remaining: 3m 22s
131:	learn: 0.4361916	total: 19.5s	remaining: 3m 21s
132:	learn: 0.4358750	total: 19.6s	remaining: 3m 21s
133:	learn: 0.4356908	total: 19.7s	remaining: 3m 21s
134:	learn: 0.4350230	total: 19.9s	remaining: 3m 21s
135:	learn: 0.4347985	total: 20s	remaining: 3m 20s
136:	learn: 0.4343561	total: 20.1s	remaining: 3m 20s
137:	learn: 0.4340909	total: 20.3s	remaining: 3m 20s
138:	learn: 0.4339247	total: 20.4s	remaining: 3m 20s
139:	learn: 0.4337032	total: 20.6s	remaining: 3m 19s
140:	learn: 0.4335847	total: 20.7s	remaining: 3m 19s
141:	learn: 0.4334104	total: 20.8s	remaining: 3m 19s
142:	learn: 0.4332238	total: 21s	remaining: 3m 19s
143:	learn: 0.4328432	total: 21.1s	remaining: 3m 18s
144:	learn: 0.4327017	total: 21.3s	remaining: 3m 18s
145:	learn: 0.4323702	total: 21.4s	remaining: 3m 18s
146:	learn: 0.4320510	total: 21.6s	remaining: 3m 18s
147:	learn: 0.4317127	total: 21.7s	remaining: 3m 18s
148:	learn: 0.4315326	total: 21.8s	remaining: 3m 18s
149:	learn: 0.4312614	total: 22s	remaining: 3m 17s
150:	learn: 0.4309846	total: 22.1s	remaining: 3m 17s
151:	learn: 0.4308043	total: 22.3s	remaining: 3m 17s
152:	learn: 0.4305173	total: 22.4s	remaining: 3m 17s
153:	learn: 0.4301383	total: 22.6s	remaining: 3m 17s
154:	learn: 0.4295965	total: 22.7s	remaining: 3m 16s
155:	learn: 0.4293825	total: 22.8s	remaining: 3m 16s
156:	learn: 0.4291595	total: 22.9s	remaining: 3m 16s
157:	learn: 0.4289482	total: 23.1s	remaining: 3m 15s
158:	learn: 0.4288062	total: 23.2s	remaining: 3m 15s
159:	learn: 0.4286485	total: 23.3s	remaining: 3m 15s
160:	learn: 0.4284836	total: 23.5s	remaining: 3m 15s
161:	learn: 0.4282930	total: 23.6s	remaining: 3m 14s
162:	learn: 0.4281037	total: 23.7s	remaining: 3m 14s
163:	learn: 0.4279654	total: 23.8s	remaining: 3m 14s
164:	learn: 0.4277494	total: 24s	remaining: 3m 14s
165:	learn: 0.4275762	total: 24.1s	remaining: 3m 13s
166:	learn: 0.4273726	total: 24.2s	remaining: 3m 13s
167:	learn: 0.4271790	total: 24.4s	remaining: 3m 13s
168:	learn: 0.4269579	total: 24.5s	remaining: 3m 12s
169:	learn: 0.4267745	total: 24.6s	remaining: 3m 12s
170:	learn: 0.4266005	total: 24.7s	remaining: 3m 12s
171:	learn: 0.4264697	total: 24.8s	remaining: 3m 11s
172:	learn: 0.4263351	total: 25s	remaining: 3m 11s
173:	learn: 0.4261825	total: 25.1s	remaining: 3m 11s
174:	learn: 0.4260034	total: 25.2s	remaining: 3m 10s
175:	learn: 0.4257574	total: 25.3s	remaining: 3m 10s
176:	learn: 0.4256100	total: 25.5s	remaining: 3m 10s
177:	learn: 0.4254636	total: 25.6s	remaining: 3m 9s
178:	learn: 0.4253274	total: 25.7s	remaining: 3m 9s
179:	learn: 0.4250559	total: 25.8s	remaining: 3m 9s
180:	learn: 0.4248062	total: 26s	remaining: 3m 9s
181:	learn: 0.4246204	total: 26.1s	remaining: 3m 9s
182:	learn: 0.4244886	total: 26.3s	remaining: 3m 9s
183:	learn: 0.4243267	total: 26.4s	remaining: 3m 8s
184:	learn: 0.4242176	total: 26.5s	remaining: 3m 8s
185:	learn: 0.4240365	total: 26.7s	remaining: 3m 8s
186:	learn: 0.4239203	total: 26.8s	remaining: 3m 8s
187:	learn: 0.4237206	total: 26.9s	remaining: 3m 7s
188:	learn: 0.4235468	total: 27.1s	remaining: 3m 7s
189:	learn: 0.4233040	total: 27.2s	remaining: 3m 7s
190:	learn: 0.4231949	total: 27.3s	remaining: 3m 7s
191:	learn: 0.4230445	total: 27.4s	remaining: 3m 7s
192:	learn: 0.4228901	total: 27.6s	remaining: 3m 6s
193:	learn: 0.4227842	total: 27.7s	remaining: 3m 6s
194:	learn: 0.4226159	total: 27.8s	remaining: 3m 5s
195:	learn: 0.4224874	total: 27.9s	remaining: 3m 5s
196:	learn: 0.4223700	total: 28s	remaining: 3m 5s
197:	learn: 0.4222499	total: 28.1s	remaining: 3m 5s
198:	learn: 0.4221292	total: 28.3s	remaining: 3m 4s
199:	learn: 0.4219813	total: 28.4s	remaining: 3m 4s
200:	learn: 0.4218506	total: 28.5s	remaining: 3m 4s
201:	learn: 0.4217154	total: 28.6s	remaining: 3m 3s
202:	learn: 0.4216022	total: 28.7s	remaining: 3m 3s
203:	learn: 0.4214807	total: 28.9s	remaining: 3m 3s
204:	learn: 0.4213785	total: 29s	remaining: 3m 3s
205:	learn: 0.4211958	total: 29.1s	remaining: 3m 2s
206:	learn: 0.4211072	total: 29.2s	remaining: 3m 2s
207:	learn: 0.4210155	total: 29.3s	remaining: 3m 2s
208:	learn: 0.4209106	total: 29.4s	remaining: 3m 1s
209:	learn: 0.4207721	total: 29.6s	remaining: 3m 1s
210:	learn: 0.4206339	total: 29.7s	remaining: 3m 1s
211:	learn: 0.4205465	total: 29.8s	remaining: 3m 1s
212:	learn: 0.4204203	total: 29.9s	remaining: 3m
213:	learn: 0.4202733	total: 30s	remaining: 3m
214:	learn: 0.4201414	total: 30.2s	remaining: 3m
215:	learn: 0.4199972	total: 30.3s	remaining: 2m 59s
216:	learn: 0.4198867	total: 30.4s	remaining: 2m 59s
217:	learn: 0.4197592	total: 30.5s	remaining: 2m 59s
218:	learn: 0.4195732	total: 30.6s	remaining: 2m 59s
219:	learn: 0.4193988	total: 30.7s	remaining: 2m 58s
220:	learn: 0.4192712	total: 30.9s	remaining: 2m 58s
221:	learn: 0.4191795	total: 31s	remaining: 2m 58s
222:	learn: 0.4190830	total: 31.1s	remaining: 2m 57s
223:	learn: 0.4189056	total: 31.2s	remaining: 2m 57s
224:	learn: 0.4187942	total: 31.3s	remaining: 2m 57s
225:	learn: 0.4186877	total: 31.5s	remaining: 2m 57s
226:	learn: 0.4185461	total: 31.7s	remaining: 2m 57s
227:	learn: 0.4184258	total: 31.8s	remaining: 2m 57s
228:	learn: 0.4183345	total: 31.9s	remaining: 2m 57s
229:	learn: 0.4181779	total: 32s	remaining: 2m 56s
230:	learn: 0.4180989	total: 32.1s	remaining: 2m 56s
231:	learn: 0.4179560	total: 32.3s	remaining: 2m 56s
232:	learn: 0.4178575	total: 32.4s	remaining: 2m 56s
233:	learn: 0.4176746	total: 32.5s	remaining: 2m 55s
234:	learn: 0.4174302	total: 32.7s	remaining: 2m 55s
235:	learn: 0.4172676	total: 32.8s	remaining: 2m 55s
236:	learn: 0.4171671	total: 32.9s	remaining: 2m 55s
237:	learn: 0.4170300	total: 33s	remaining: 2m 54s
238:	learn: 0.4169264	total: 33.1s	remaining: 2m 54s
239:	learn: 0.4167842	total: 33.2s	remaining: 2m 54s
240:	learn: 0.4166912	total: 33.4s	remaining: 2m 54s
241:	learn: 0.4165760	total: 33.5s	remaining: 2m 54s
242:	learn: 0.4164749	total: 33.6s	remaining: 2m 53s
243:	learn: 0.4163632	total: 33.7s	remaining: 2m 53s
244:	learn: 0.4162762	total: 33.9s	remaining: 2m 53s
245:	learn: 0.4161740	total: 34s	remaining: 2m 53s
246:	learn: 0.4160749	total: 34.1s	remaining: 2m 53s
247:	learn: 0.4159666	total: 34.3s	remaining: 2m 52s
248:	learn: 0.4158628	total: 34.4s	remaining: 2m 52s
249:	learn: 0.4157511	total: 34.5s	remaining: 2m 52s
250:	learn: 0.4156590	total: 34.6s	remaining: 2m 52s
251:	learn: 0.4155760	total: 34.7s	remaining: 2m 51s
252:	learn: 0.4154753	total: 34.8s	remaining: 2m 51s
253:	learn: 0.4153753	total: 34.9s	remaining: 2m 51s
254:	learn: 0.4152081	total: 35.1s	remaining: 2m 51s
255:	learn: 0.4151150	total: 35.2s	remaining: 2m 50s
256:	learn: 0.4150386	total: 35.3s	remaining: 2m 50s
257:	learn: 0.4149674	total: 35.4s	remaining: 2m 50s
258:	learn: 0.4147705	total: 35.5s	remaining: 2m 50s
259:	learn: 0.4146481	total: 35.6s	remaining: 2m 49s
260:	learn: 0.4145137	total: 35.7s	remaining: 2m 49s
261:	learn: 0.4143696	total: 35.9s	remaining: 2m 49s
262:	learn: 0.4142986	total: 36s	remaining: 2m 49s
263:	learn: 0.4142220	total: 36.1s	remaining: 2m 49s
264:	learn: 0.4141148	total: 36.2s	remaining: 2m 48s
265:	learn: 0.4140545	total: 36.4s	remaining: 2m 48s
266:	learn: 0.4139518	total: 36.5s	remaining: 2m 48s
267:	learn: 0.4137133	total: 36.7s	remaining: 2m 48s
268:	learn: 0.4136013	total: 36.8s	remaining: 2m 48s
269:	learn: 0.4135186	total: 37s	remaining: 2m 48s
270:	learn: 0.4134294	total: 37.1s	remaining: 2m 48s
271:	learn: 0.4133328	total: 37.3s	remaining: 2m 48s
272:	learn: 0.4132272	total: 37.4s	remaining: 2m 48s
273:	learn: 0.4130936	total: 37.5s	remaining: 2m 47s
274:	learn: 0.4130175	total: 37.7s	remaining: 2m 47s
275:	learn: 0.4129229	total: 37.8s	remaining: 2m 47s
276:	learn: 0.4128496	total: 37.9s	remaining: 2m 47s
277:	learn: 0.4127433	total: 38.1s	remaining: 2m 47s
278:	learn: 0.4126188	total: 38.2s	remaining: 2m 47s
279:	learn: 0.4125088	total: 38.3s	remaining: 2m 46s
280:	learn: 0.4122998	total: 38.4s	remaining: 2m 46s
281:	learn: 0.4122041	total: 38.5s	remaining: 2m 46s
282:	learn: 0.4121354	total: 38.6s	remaining: 2m 46s
283:	learn: 0.4120381	total: 38.7s	remaining: 2m 45s
284:	learn: 0.4119412	total: 38.9s	remaining: 2m 45s
285:	learn: 0.4118793	total: 39s	remaining: 2m 45s
286:	learn: 0.4118037	total: 39.1s	remaining: 2m 45s
287:	learn: 0.4116976	total: 39.2s	remaining: 2m 44s
288:	learn: 0.4115950	total: 39.3s	remaining: 2m 44s
289:	learn: 0.4114707	total: 39.4s	remaining: 2m 44s
290:	learn: 0.4113039	total: 39.6s	remaining: 2m 44s
291:	learn: 0.4112051	total: 39.7s	remaining: 2m 44s
292:	learn: 0.4111046	total: 39.8s	remaining: 2m 43s
293:	learn: 0.4110371	total: 39.9s	remaining: 2m 43s
294:	learn: 0.4109335	total: 40s	remaining: 2m 43s
295:	learn: 0.4108359	total: 40.1s	remaining: 2m 43s
296:	learn: 0.4106773	total: 40.3s	remaining: 2m 43s
297:	learn: 0.4105960	total: 40.4s	remaining: 2m 42s
298:	learn: 0.4104790	total: 40.5s	remaining: 2m 42s
299:	learn: 0.4103990	total: 40.6s	remaining: 2m 42s
300:	learn: 0.4103135	total: 40.7s	remaining: 2m 42s
301:	learn: 0.4102025	total: 40.8s	remaining: 2m 41s
302:	learn: 0.4101275	total: 40.9s	remaining: 2m 41s
303:	learn: 0.4100594	total: 41s	remaining: 2m 41s
304:	learn: 0.4099722	total: 41.1s	remaining: 2m 41s
305:	learn: 0.4099153	total: 41.2s	remaining: 2m 40s
306:	learn: 0.4098509	total: 41.4s	remaining: 2m 40s
307:	learn: 0.4098077	total: 41.5s	remaining: 2m 40s
308:	learn: 0.4097096	total: 41.6s	remaining: 2m 40s
309:	learn: 0.4096000	total: 41.7s	remaining: 2m 40s
310:	learn: 0.4095139	total: 41.8s	remaining: 2m 39s
311:	learn: 0.4094345	total: 41.9s	remaining: 2m 39s
312:	learn: 0.4093615	total: 42s	remaining: 2m 39s
313:	learn: 0.4092875	total: 42.1s	remaining: 2m 39s
314:	learn: 0.4091700	total: 42.2s	remaining: 2m 38s
315:	learn: 0.4090979	total: 42.3s	remaining: 2m 38s
316:	learn: 0.4090073	total: 42.5s	remaining: 2m 38s
317:	learn: 0.4088995	total: 42.6s	remaining: 2m 38s
318:	learn: 0.4087867	total: 42.7s	remaining: 2m 38s
319:	learn: 0.4087149	total: 42.8s	remaining: 2m 37s
320:	learn: 0.4086473	total: 42.9s	remaining: 2m 37s
321:	learn: 0.4085598	total: 43.1s	remaining: 2m 37s
322:	learn: 0.4084647	total: 43.2s	remaining: 2m 37s
323:	learn: 0.4083842	total: 43.3s	remaining: 2m 37s
324:	learn: 0.4083312	total: 43.4s	remaining: 2m 36s
325:	learn: 0.4082394	total: 43.5s	remaining: 2m 36s
326:	learn: 0.4081457	total: 43.6s	remaining: 2m 36s
327:	learn: 0.4080730	total: 43.7s	remaining: 2m 36s
328:	learn: 0.4079945	total: 43.8s	remaining: 2m 35s
329:	learn: 0.4079157	total: 43.9s	remaining: 2m 35s
330:	learn: 0.4078371	total: 44s	remaining: 2m 35s
331:	learn: 0.4077720	total: 44.1s	remaining: 2m 35s
332:	learn: 0.4076823	total: 44.2s	remaining: 2m 34s
333:	learn: 0.4076090	total: 44.3s	remaining: 2m 34s
334:	learn: 0.4075355	total: 44.4s	remaining: 2m 34s
335:	learn: 0.4074448	total: 44.6s	remaining: 2m 34s
336:	learn: 0.4073741	total: 44.7s	remaining: 2m 34s
337:	learn: 0.4073376	total: 44.8s	remaining: 2m 33s
338:	learn: 0.4072187	total: 44.9s	remaining: 2m 33s
339:	learn: 0.4071512	total: 45s	remaining: 2m 33s
340:	learn: 0.4071029	total: 45.1s	remaining: 2m 33s
341:	learn: 0.4070398	total: 45.2s	remaining: 2m 32s
342:	learn: 0.4069531	total: 45.3s	remaining: 2m 32s
343:	learn: 0.4068749	total: 45.4s	remaining: 2m 32s
344:	learn: 0.4067803	total: 45.5s	remaining: 2m 32s
345:	learn: 0.4067148	total: 45.6s	remaining: 2m 32s
346:	learn: 0.4066003	total: 45.7s	remaining: 2m 31s
347:	learn: 0.4065200	total: 45.9s	remaining: 2m 31s
348:	learn: 0.4064517	total: 46s	remaining: 2m 31s
349:	learn: 0.4063674	total: 46.1s	remaining: 2m 31s
350:	learn: 0.4063104	total: 46.2s	remaining: 2m 31s
351:	learn: 0.4062453	total: 46.3s	remaining: 2m 30s
352:	learn: 0.4061515	total: 46.4s	remaining: 2m 30s
353:	learn: 0.4060939	total: 46.5s	remaining: 2m 30s
354:	learn: 0.4060383	total: 46.6s	remaining: 2m 30s
355:	learn: 0.4059554	total: 46.7s	remaining: 2m 30s
356:	learn: 0.4058925	total: 46.8s	remaining: 2m 29s
357:	learn: 0.4058376	total: 46.9s	remaining: 2m 29s
358:	learn: 0.4057267	total: 47s	remaining: 2m 29s
359:	learn: 0.4056605	total: 47.1s	remaining: 2m 29s
360:	learn: 0.4055708	total: 47.2s	remaining: 2m 28s
361:	learn: 0.4054898	total: 47.3s	remaining: 2m 28s
362:	learn: 0.4054329	total: 47.4s	remaining: 2m 28s
363:	learn: 0.4053468	total: 47.5s	remaining: 2m 28s
364:	learn: 0.4052722	total: 47.6s	remaining: 2m 28s
365:	learn: 0.4051933	total: 47.8s	remaining: 2m 27s
366:	learn: 0.4050899	total: 47.9s	remaining: 2m 27s
367:	learn: 0.4050266	total: 48s	remaining: 2m 27s
368:	learn: 0.4049682	total: 48.1s	remaining: 2m 27s
369:	learn: 0.4048889	total: 48.2s	remaining: 2m 27s
370:	learn: 0.4048277	total: 48.3s	remaining: 2m 26s
371:	learn: 0.4047492	total: 48.4s	remaining: 2m 26s
372:	learn: 0.4046581	total: 48.5s	remaining: 2m 26s
373:	learn: 0.4046075	total: 48.6s	remaining: 2m 26s
374:	learn: 0.4045325	total: 48.7s	remaining: 2m 26s
375:	learn: 0.4044489	total: 48.8s	remaining: 2m 25s
376:	learn: 0.4043388	total: 48.9s	remaining: 2m 25s
377:	learn: 0.4042605	total: 49s	remaining: 2m 25s
378:	learn: 0.4041762	total: 49.1s	remaining: 2m 25s
379:	learn: 0.4040944	total: 49.3s	remaining: 2m 25s
380:	learn: 0.4040446	total: 49.3s	remaining: 2m 24s
381:	learn: 0.4039756	total: 49.5s	remaining: 2m 24s
382:	learn: 0.4039022	total: 49.6s	remaining: 2m 24s
383:	learn: 0.4038412	total: 49.7s	remaining: 2m 24s
384:	learn: 0.4037740	total: 49.8s	remaining: 2m 24s
385:	learn: 0.4036765	total: 49.9s	remaining: 2m 23s
386:	learn: 0.4035938	total: 50s	remaining: 2m 23s
387:	learn: 0.4035348	total: 50.1s	remaining: 2m 23s
388:	learn: 0.4034585	total: 50.2s	remaining: 2m 23s
389:	learn: 0.4033970	total: 50.3s	remaining: 2m 23s
390:	learn: 0.4033244	total: 50.4s	remaining: 2m 23s
391:	learn: 0.4032615	total: 50.5s	remaining: 2m 22s
392:	learn: 0.4031947	total: 50.6s	remaining: 2m 22s
393:	learn: 0.4031231	total: 50.7s	remaining: 2m 22s
394:	learn: 0.4030391	total: 50.8s	remaining: 2m 22s
395:	learn: 0.4029860	total: 51s	remaining: 2m 22s
396:	learn: 0.4029246	total: 51.1s	remaining: 2m 21s
397:	learn: 0.4028537	total: 51.2s	remaining: 2m 21s
398:	learn: 0.4027740	total: 51.3s	remaining: 2m 21s
399:	learn: 0.4027176	total: 51.4s	remaining: 2m 21s
400:	learn: 0.4026674	total: 51.5s	remaining: 2m 21s
401:	learn: 0.4025778	total: 51.6s	remaining: 2m 20s
402:	learn: 0.4025163	total: 51.7s	remaining: 2m 20s
403:	learn: 0.4024578	total: 51.8s	remaining: 2m 20s
404:	learn: 0.4023512	total: 52s	remaining: 2m 20s
405:	learn: 0.4023033	total: 52.1s	remaining: 2m 20s
406:	learn: 0.4022405	total: 52.2s	remaining: 2m 20s
407:	learn: 0.4022003	total: 52.3s	remaining: 2m 20s
408:	learn: 0.4021298	total: 52.5s	remaining: 2m 19s
409:	learn: 0.4020683	total: 52.6s	remaining: 2m 19s
410:	learn: 0.4020015	total: 52.7s	remaining: 2m 19s
411:	learn: 0.4019275	total: 52.9s	remaining: 2m 19s
412:	learn: 0.4018377	total: 53s	remaining: 2m 19s
413:	learn: 0.4017699	total: 53.2s	remaining: 2m 19s
414:	learn: 0.4017360	total: 53.3s	remaining: 2m 19s
415:	learn: 0.4016864	total: 53.4s	remaining: 2m 19s
416:	learn: 0.4016149	total: 53.5s	remaining: 2m 18s
417:	learn: 0.4015742	total: 53.6s	remaining: 2m 18s
418:	learn: 0.4015178	total: 53.7s	remaining: 2m 18s
419:	learn: 0.4014492	total: 53.9s	remaining: 2m 18s
420:	learn: 0.4013875	total: 54s	remaining: 2m 18s
421:	learn: 0.4013270	total: 54.1s	remaining: 2m 18s
422:	learn: 0.4012617	total: 54.2s	remaining: 2m 18s
423:	learn: 0.4011901	total: 54.3s	remaining: 2m 17s
424:	learn: 0.4011073	total: 54.4s	remaining: 2m 17s
425:	learn: 0.4010482	total: 54.6s	remaining: 2m 17s
426:	learn: 0.4009577	total: 54.7s	remaining: 2m 17s
427:	learn: 0.4009061	total: 54.8s	remaining: 2m 17s
428:	learn: 0.4008419	total: 54.9s	remaining: 2m 16s
429:	learn: 0.4007792	total: 55s	remaining: 2m 16s
430:	learn: 0.4006950	total: 55.1s	remaining: 2m 16s
431:	learn: 0.4006193	total: 55.2s	remaining: 2m 16s
432:	learn: 0.4005393	total: 55.3s	remaining: 2m 16s
433:	learn: 0.4004981	total: 55.4s	remaining: 2m 16s
434:	learn: 0.4004301	total: 55.5s	remaining: 2m 15s
435:	learn: 0.4003863	total: 55.6s	remaining: 2m 15s
436:	learn: 0.4003295	total: 55.7s	remaining: 2m 15s
437:	learn: 0.4002646	total: 55.8s	remaining: 2m 15s
438:	learn: 0.4002130	total: 55.9s	remaining: 2m 15s
439:	learn: 0.4001510	total: 56s	remaining: 2m 14s
440:	learn: 0.4000816	total: 56.2s	remaining: 2m 14s
441:	learn: 0.4000083	total: 56.3s	remaining: 2m 14s
442:	learn: 0.3999375	total: 56.3s	remaining: 2m 14s
443:	learn: 0.3998809	total: 56.5s	remaining: 2m 14s
444:	learn: 0.3998259	total: 56.6s	remaining: 2m 14s
445:	learn: 0.3997779	total: 56.7s	remaining: 2m 13s
446:	learn: 0.3997181	total: 56.8s	remaining: 2m 13s
447:	learn: 0.3996629	total: 56.9s	remaining: 2m 13s
448:	learn: 0.3995961	total: 57s	remaining: 2m 13s
449:	learn: 0.3995250	total: 57.1s	remaining: 2m 13s
450:	learn: 0.3994681	total: 57.2s	remaining: 2m 13s
451:	learn: 0.3994387	total: 57.3s	remaining: 2m 12s
452:	learn: 0.3993715	total: 57.4s	remaining: 2m 12s
453:	learn: 0.3992926	total: 57.5s	remaining: 2m 12s
454:	learn: 0.3992255	total: 57.6s	remaining: 2m 12s
455:	learn: 0.3992060	total: 57.7s	remaining: 2m 12s
456:	learn: 0.3991339	total: 57.8s	remaining: 2m 11s
457:	learn: 0.3990697	total: 57.9s	remaining: 2m 11s
458:	learn: 0.3989994	total: 58s	remaining: 2m 11s
459:	learn: 0.3989347	total: 58.2s	remaining: 2m 11s
460:	learn: 0.3988620	total: 58.3s	remaining: 2m 11s
461:	learn: 0.3988170	total: 58.4s	remaining: 2m 11s
462:	learn: 0.3987709	total: 58.4s	remaining: 2m 10s
463:	learn: 0.3987080	total: 58.6s	remaining: 2m 10s
464:	learn: 0.3986518	total: 58.7s	remaining: 2m 10s
465:	learn: 0.3985952	total: 58.8s	remaining: 2m 10s
466:	learn: 0.3985447	total: 58.9s	remaining: 2m 10s
467:	learn: 0.3984873	total: 59s	remaining: 2m 10s
468:	learn: 0.3984199	total: 59.1s	remaining: 2m 9s
469:	learn: 0.3983694	total: 59.2s	remaining: 2m 9s
470:	learn: 0.3982935	total: 59.3s	remaining: 2m 9s
471:	learn: 0.3982542	total: 59.4s	remaining: 2m 9s
472:	learn: 0.3982149	total: 59.5s	remaining: 2m 9s
473:	learn: 0.3981421	total: 59.7s	remaining: 2m 9s
474:	learn: 0.3980793	total: 59.7s	remaining: 2m 8s
475:	learn: 0.3980121	total: 59.8s	remaining: 2m 8s
476:	learn: 0.3979451	total: 60s	remaining: 2m 8s
477:	learn: 0.3979029	total: 1m	remaining: 2m 8s
478:	learn: 0.3978648	total: 1m	remaining: 2m 8s
479:	learn: 0.3978078	total: 1m	remaining: 2m 8s
480:	learn: 0.3977530	total: 1m	remaining: 2m 7s
481:	learn: 0.3976928	total: 1m	remaining: 2m 7s
482:	learn: 0.3976292	total: 1m	remaining: 2m 7s
483:	learn: 0.3975728	total: 1m	remaining: 2m 7s
484:	learn: 0.3975070	total: 1m	remaining: 2m 7s
485:	learn: 0.3974447	total: 1m	remaining: 2m 7s
486:	learn: 0.3974061	total: 1m	remaining: 2m 6s
487:	learn: 0.3973579	total: 1m 1s	remaining: 2m 6s
488:	learn: 0.3973101	total: 1m 1s	remaining: 2m 6s
489:	learn: 0.3972447	total: 1m 1s	remaining: 2m 6s
490:	learn: 0.3972068	total: 1m 1s	remaining: 2m 6s
491:	learn: 0.3971429	total: 1m 1s	remaining: 2m 5s
492:	learn: 0.3970903	total: 1m 1s	remaining: 2m 5s
493:	learn: 0.3970544	total: 1m 1s	remaining: 2m 5s
494:	learn: 0.3969764	total: 1m 1s	remaining: 2m 5s
495:	learn: 0.3969121	total: 1m 1s	remaining: 2m 5s
496:	learn: 0.3968609	total: 1m 2s	remaining: 2m 5s
497:	learn: 0.3967946	total: 1m 2s	remaining: 2m 5s
498:	learn: 0.3967303	total: 1m 2s	remaining: 2m 4s
499:	learn: 0.3966899	total: 1m 2s	remaining: 2m 4s
500:	learn: 0.3966092	total: 1m 2s	remaining: 2m 4s
501:	learn: 0.3965716	total: 1m 2s	remaining: 2m 4s
502:	learn: 0.3965089	total: 1m 2s	remaining: 2m 4s
503:	learn: 0.3964429	total: 1m 2s	remaining: 2m 4s
504:	learn: 0.3963912	total: 1m 2s	remaining: 2m 3s
505:	learn: 0.3963348	total: 1m 3s	remaining: 2m 3s
506:	learn: 0.3962700	total: 1m 3s	remaining: 2m 3s
507:	learn: 0.3962125	total: 1m 3s	remaining: 2m 3s
508:	learn: 0.3961743	total: 1m 3s	remaining: 2m 3s
509:	learn: 0.3961290	total: 1m 3s	remaining: 2m 3s
510:	learn: 0.3960709	total: 1m 3s	remaining: 2m 2s
511:	learn: 0.3960198	total: 1m 3s	remaining: 2m 2s
512:	learn: 0.3959969	total: 1m 3s	remaining: 2m 2s
513:	learn: 0.3959312	total: 1m 3s	remaining: 2m 2s
514:	learn: 0.3958748	total: 1m 3s	remaining: 2m 2s
515:	learn: 0.3958212	total: 1m 4s	remaining: 2m 2s
516:	learn: 0.3957748	total: 1m 4s	remaining: 2m 1s
517:	learn: 0.3956916	total: 1m 4s	remaining: 2m 1s
518:	learn: 0.3956382	total: 1m 4s	remaining: 2m 1s
519:	learn: 0.3955843	total: 1m 4s	remaining: 2m 1s
520:	learn: 0.3955158	total: 1m 4s	remaining: 2m 1s
521:	learn: 0.3954743	total: 1m 4s	remaining: 2m 1s
522:	learn: 0.3954244	total: 1m 4s	remaining: 2m
523:	learn: 0.3953693	total: 1m 4s	remaining: 2m
524:	learn: 0.3952966	total: 1m 4s	remaining: 2m
525:	learn: 0.3952516	total: 1m 5s	remaining: 2m
526:	learn: 0.3951990	total: 1m 5s	remaining: 2m
527:	learn: 0.3951493	total: 1m 5s	remaining: 2m
528:	learn: 0.3950849	total: 1m 5s	remaining: 1m 59s
529:	learn: 0.3950369	total: 1m 5s	remaining: 1m 59s
530:	learn: 0.3949754	total: 1m 5s	remaining: 1m 59s
531:	learn: 0.3949102	total: 1m 5s	remaining: 1m 59s
532:	learn: 0.3948394	total: 1m 5s	remaining: 1m 59s
533:	learn: 0.3947893	total: 1m 5s	remaining: 1m 59s
534:	learn: 0.3947300	total: 1m 6s	remaining: 1m 59s
535:	learn: 0.3946997	total: 1m 6s	remaining: 1m 58s
536:	learn: 0.3946496	total: 1m 6s	remaining: 1m 58s
537:	learn: 0.3945816	total: 1m 6s	remaining: 1m 58s
538:	learn: 0.3945255	total: 1m 6s	remaining: 1m 58s
539:	learn: 0.3944536	total: 1m 6s	remaining: 1m 58s
540:	learn: 0.3943858	total: 1m 6s	remaining: 1m 58s
541:	learn: 0.3943184	total: 1m 6s	remaining: 1m 58s
542:	learn: 0.3941826	total: 1m 6s	remaining: 1m 57s
543:	learn: 0.3941282	total: 1m 6s	remaining: 1m 57s
544:	learn: 0.3940560	total: 1m 7s	remaining: 1m 57s
545:	learn: 0.3939917	total: 1m 7s	remaining: 1m 57s
546:	learn: 0.3939516	total: 1m 7s	remaining: 1m 57s
547:	learn: 0.3939031	total: 1m 7s	remaining: 1m 57s
548:	learn: 0.3938385	total: 1m 7s	remaining: 1m 56s
549:	learn: 0.3937642	total: 1m 7s	remaining: 1m 56s
550:	learn: 0.3937074	total: 1m 7s	remaining: 1m 56s
551:	learn: 0.3936455	total: 1m 7s	remaining: 1m 56s
552:	learn: 0.3935911	total: 1m 8s	remaining: 1m 56s
553:	learn: 0.3935393	total: 1m 8s	remaining: 1m 56s
554:	learn: 0.3934980	total: 1m 8s	remaining: 1m 56s
555:	learn: 0.3934379	total: 1m 8s	remaining: 1m 56s
556:	learn: 0.3933763	total: 1m 8s	remaining: 1m 55s
557:	learn: 0.3933247	total: 1m 8s	remaining: 1m 55s
558:	learn: 0.3932844	total: 1m 8s	remaining: 1m 55s
559:	learn: 0.3932207	total: 1m 8s	remaining: 1m 55s
560:	learn: 0.3931624	total: 1m 8s	remaining: 1m 55s
561:	learn: 0.3931150	total: 1m 9s	remaining: 1m 55s
562:	learn: 0.3930729	total: 1m 9s	remaining: 1m 55s
563:	learn: 0.3930272	total: 1m 9s	remaining: 1m 55s
564:	learn: 0.3929887	total: 1m 9s	remaining: 1m 54s
565:	learn: 0.3929360	total: 1m 9s	remaining: 1m 54s
566:	learn: 0.3928871	total: 1m 9s	remaining: 1m 54s
567:	learn: 0.3928428	total: 1m 9s	remaining: 1m 54s
568:	learn: 0.3927867	total: 1m 9s	remaining: 1m 54s
569:	learn: 0.3927301	total: 1m 9s	remaining: 1m 54s
570:	learn: 0.3926636	total: 1m 10s	remaining: 1m 54s
571:	learn: 0.3926073	total: 1m 10s	remaining: 1m 53s
572:	learn: 0.3925463	total: 1m 10s	remaining: 1m 53s
573:	learn: 0.3924903	total: 1m 10s	remaining: 1m 53s
574:	learn: 0.3924380	total: 1m 10s	remaining: 1m 53s
575:	learn: 0.3923956	total: 1m 10s	remaining: 1m 53s
576:	learn: 0.3923426	total: 1m 10s	remaining: 1m 53s
577:	learn: 0.3922596	total: 1m 10s	remaining: 1m 53s
578:	learn: 0.3922112	total: 1m 10s	remaining: 1m 52s
579:	learn: 0.3921505	total: 1m 11s	remaining: 1m 52s
580:	learn: 0.3920800	total: 1m 11s	remaining: 1m 52s
581:	learn: 0.3920346	total: 1m 11s	remaining: 1m 52s
582:	learn: 0.3919610	total: 1m 11s	remaining: 1m 52s
583:	learn: 0.3919127	total: 1m 11s	remaining: 1m 52s
584:	learn: 0.3918597	total: 1m 11s	remaining: 1m 52s
585:	learn: 0.3918193	total: 1m 11s	remaining: 1m 51s
586:	learn: 0.3917617	total: 1m 11s	remaining: 1m 51s
587:	learn: 0.3917059	total: 1m 11s	remaining: 1m 51s
588:	learn: 0.3916623	total: 1m 12s	remaining: 1m 51s
589:	learn: 0.3916063	total: 1m 12s	remaining: 1m 51s
590:	learn: 0.3915587	total: 1m 12s	remaining: 1m 51s
591:	learn: 0.3915113	total: 1m 12s	remaining: 1m 51s
592:	learn: 0.3914513	total: 1m 12s	remaining: 1m 50s
593:	learn: 0.3914050	total: 1m 12s	remaining: 1m 50s
594:	learn: 0.3913430	total: 1m 12s	remaining: 1m 50s
595:	learn: 0.3912803	total: 1m 12s	remaining: 1m 50s
596:	learn: 0.3912158	total: 1m 13s	remaining: 1m 50s
597:	learn: 0.3911502	total: 1m 13s	remaining: 1m 50s
598:	learn: 0.3911011	total: 1m 13s	remaining: 1m 50s
599:	learn: 0.3910448	total: 1m 13s	remaining: 1m 49s
600:	learn: 0.3909947	total: 1m 13s	remaining: 1m 49s
601:	learn: 0.3909460	total: 1m 13s	remaining: 1m 49s
602:	learn: 0.3909274	total: 1m 13s	remaining: 1m 49s
603:	learn: 0.3908859	total: 1m 13s	remaining: 1m 49s
604:	learn: 0.3908238	total: 1m 13s	remaining: 1m 49s
605:	learn: 0.3907634	total: 1m 13s	remaining: 1m 49s
606:	learn: 0.3907109	total: 1m 14s	remaining: 1m 48s
607:	learn: 0.3906635	total: 1m 14s	remaining: 1m 48s
608:	learn: 0.3906105	total: 1m 14s	remaining: 1m 48s
609:	learn: 0.3905485	total: 1m 14s	remaining: 1m 48s
610:	learn: 0.3905047	total: 1m 14s	remaining: 1m 48s
611:	learn: 0.3904459	total: 1m 14s	remaining: 1m 48s
612:	learn: 0.3903966	total: 1m 14s	remaining: 1m 48s
613:	learn: 0.3903597	total: 1m 14s	remaining: 1m 47s
614:	learn: 0.3903099	total: 1m 14s	remaining: 1m 47s
615:	learn: 0.3902470	total: 1m 14s	remaining: 1m 47s
616:	learn: 0.3902113	total: 1m 15s	remaining: 1m 47s
617:	learn: 0.3901529	total: 1m 15s	remaining: 1m 47s
618:	learn: 0.3901029	total: 1m 15s	remaining: 1m 47s
619:	learn: 0.3900678	total: 1m 15s	remaining: 1m 47s
620:	learn: 0.3900236	total: 1m 15s	remaining: 1m 46s
621:	learn: 0.3899780	total: 1m 15s	remaining: 1m 46s
622:	learn: 0.3899428	total: 1m 15s	remaining: 1m 46s
623:	learn: 0.3899185	total: 1m 15s	remaining: 1m 46s
624:	learn: 0.3898697	total: 1m 15s	remaining: 1m 46s
625:	learn: 0.3898460	total: 1m 15s	remaining: 1m 46s
626:	learn: 0.3897863	total: 1m 16s	remaining: 1m 45s
627:	learn: 0.3897248	total: 1m 16s	remaining: 1m 45s
628:	learn: 0.3896811	total: 1m 16s	remaining: 1m 45s
629:	learn: 0.3896111	total: 1m 16s	remaining: 1m 45s
630:	learn: 0.3895545	total: 1m 16s	remaining: 1m 45s
631:	learn: 0.3894962	total: 1m 16s	remaining: 1m 45s
632:	learn: 0.3894276	total: 1m 16s	remaining: 1m 45s
633:	learn: 0.3893746	total: 1m 16s	remaining: 1m 45s
634:	learn: 0.3893070	total: 1m 17s	remaining: 1m 44s
635:	learn: 0.3892427	total: 1m 17s	remaining: 1m 44s
636:	learn: 0.3891795	total: 1m 17s	remaining: 1m 44s
637:	learn: 0.3891224	total: 1m 17s	remaining: 1m 44s
638:	learn: 0.3890620	total: 1m 17s	remaining: 1m 44s
639:	learn: 0.3890262	total: 1m 17s	remaining: 1m 44s
640:	learn: 0.3889689	total: 1m 17s	remaining: 1m 44s
641:	learn: 0.3889056	total: 1m 17s	remaining: 1m 43s
642:	learn: 0.3888677	total: 1m 17s	remaining: 1m 43s
643:	learn: 0.3888063	total: 1m 17s	remaining: 1m 43s
644:	learn: 0.3887652	total: 1m 18s	remaining: 1m 43s
645:	learn: 0.3887105	total: 1m 18s	remaining: 1m 43s
646:	learn: 0.3886518	total: 1m 18s	remaining: 1m 43s
647:	learn: 0.3886171	total: 1m 18s	remaining: 1m 43s
648:	learn: 0.3885612	total: 1m 18s	remaining: 1m 42s
649:	learn: 0.3884965	total: 1m 18s	remaining: 1m 42s
650:	learn: 0.3884595	total: 1m 18s	remaining: 1m 42s
651:	learn: 0.3883970	total: 1m 18s	remaining: 1m 42s
652:	learn: 0.3883434	total: 1m 18s	remaining: 1m 42s
653:	learn: 0.3883037	total: 1m 19s	remaining: 1m 42s
654:	learn: 0.3882631	total: 1m 19s	remaining: 1m 42s
655:	learn: 0.3882266	total: 1m 19s	remaining: 1m 41s
656:	learn: 0.3881690	total: 1m 19s	remaining: 1m 41s
657:	learn: 0.3881016	total: 1m 19s	remaining: 1m 41s
658:	learn: 0.3880490	total: 1m 19s	remaining: 1m 41s
659:	learn: 0.3879773	total: 1m 19s	remaining: 1m 41s
660:	learn: 0.3879516	total: 1m 19s	remaining: 1m 41s
661:	learn: 0.3878929	total: 1m 19s	remaining: 1m 41s
662:	learn: 0.3878310	total: 1m 19s	remaining: 1m 40s
663:	learn: 0.3877970	total: 1m 20s	remaining: 1m 40s
664:	learn: 0.3877407	total: 1m 20s	remaining: 1m 40s
665:	learn: 0.3876955	total: 1m 20s	remaining: 1m 40s
666:	learn: 0.3876427	total: 1m 20s	remaining: 1m 40s
667:	learn: 0.3875743	total: 1m 20s	remaining: 1m 40s
668:	learn: 0.3875400	total: 1m 20s	remaining: 1m 40s
669:	learn: 0.3874970	total: 1m 20s	remaining: 1m 40s
670:	learn: 0.3874383	total: 1m 20s	remaining: 1m 39s
671:	learn: 0.3873798	total: 1m 20s	remaining: 1m 39s
672:	learn: 0.3873212	total: 1m 21s	remaining: 1m 39s
673:	learn: 0.3872917	total: 1m 21s	remaining: 1m 39s
674:	learn: 0.3872352	total: 1m 21s	remaining: 1m 39s
675:	learn: 0.3871905	total: 1m 21s	remaining: 1m 39s
676:	learn: 0.3871166	total: 1m 21s	remaining: 1m 39s
677:	learn: 0.3870597	total: 1m 21s	remaining: 1m 38s
678:	learn: 0.3869971	total: 1m 21s	remaining: 1m 38s
679:	learn: 0.3869373	total: 1m 21s	remaining: 1m 38s
680:	learn: 0.3868581	total: 1m 21s	remaining: 1m 38s
681:	learn: 0.3868009	total: 1m 21s	remaining: 1m 38s
682:	learn: 0.3867378	total: 1m 22s	remaining: 1m 38s
683:	learn: 0.3866842	total: 1m 22s	remaining: 1m 38s
684:	learn: 0.3866185	total: 1m 22s	remaining: 1m 37s
685:	learn: 0.3865589	total: 1m 22s	remaining: 1m 37s
686:	learn: 0.3865073	total: 1m 22s	remaining: 1m 37s
687:	learn: 0.3864674	total: 1m 22s	remaining: 1m 37s
688:	learn: 0.3864404	total: 1m 22s	remaining: 1m 37s
689:	learn: 0.3863859	total: 1m 22s	remaining: 1m 37s
690:	learn: 0.3863378	total: 1m 22s	remaining: 1m 37s
691:	learn: 0.3863002	total: 1m 23s	remaining: 1m 36s
692:	learn: 0.3862653	total: 1m 23s	remaining: 1m 36s
693:	learn: 0.3862201	total: 1m 23s	remaining: 1m 36s
694:	learn: 0.3861649	total: 1m 23s	remaining: 1m 36s
695:	learn: 0.3861233	total: 1m 23s	remaining: 1m 36s
696:	learn: 0.3860849	total: 1m 23s	remaining: 1m 36s
697:	learn: 0.3860345	total: 1m 23s	remaining: 1m 36s
698:	learn: 0.3859958	total: 1m 23s	remaining: 1m 36s
699:	learn: 0.3859472	total: 1m 24s	remaining: 1m 36s
700:	learn: 0.3859107	total: 1m 24s	remaining: 1m 35s
701:	learn: 0.3858684	total: 1m 24s	remaining: 1m 35s
702:	learn: 0.3858091	total: 1m 24s	remaining: 1m 35s
703:	learn: 0.3857706	total: 1m 24s	remaining: 1m 35s
704:	learn: 0.3857109	total: 1m 24s	remaining: 1m 35s
705:	learn: 0.3856766	total: 1m 24s	remaining: 1m 35s
706:	learn: 0.3856157	total: 1m 24s	remaining: 1m 35s
707:	learn: 0.3855617	total: 1m 25s	remaining: 1m 35s
708:	learn: 0.3855146	total: 1m 25s	remaining: 1m 34s
709:	learn: 0.3854525	total: 1m 25s	remaining: 1m 34s
710:	learn: 0.3853953	total: 1m 25s	remaining: 1m 34s
711:	learn: 0.3853453	total: 1m 25s	remaining: 1m 34s
712:	learn: 0.3852915	total: 1m 25s	remaining: 1m 34s
713:	learn: 0.3852426	total: 1m 25s	remaining: 1m 34s
714:	learn: 0.3851845	total: 1m 25s	remaining: 1m 34s
715:	learn: 0.3851262	total: 1m 25s	remaining: 1m 34s
716:	learn: 0.3850689	total: 1m 26s	remaining: 1m 33s
717:	learn: 0.3850118	total: 1m 26s	remaining: 1m 33s
718:	learn: 0.3849641	total: 1m 26s	remaining: 1m 33s
719:	learn: 0.3849067	total: 1m 26s	remaining: 1m 33s
720:	learn: 0.3848541	total: 1m 26s	remaining: 1m 33s
721:	learn: 0.3848093	total: 1m 26s	remaining: 1m 33s
722:	learn: 0.3847522	total: 1m 26s	remaining: 1m 33s
723:	learn: 0.3846994	total: 1m 26s	remaining: 1m 33s
724:	learn: 0.3846452	total: 1m 26s	remaining: 1m 32s
725:	learn: 0.3845763	total: 1m 27s	remaining: 1m 32s
726:	learn: 0.3845282	total: 1m 27s	remaining: 1m 32s
727:	learn: 0.3844831	total: 1m 27s	remaining: 1m 32s
728:	learn: 0.3844419	total: 1m 27s	remaining: 1m 32s
729:	learn: 0.3843886	total: 1m 27s	remaining: 1m 32s
730:	learn: 0.3843325	total: 1m 27s	remaining: 1m 32s
731:	learn: 0.3842863	total: 1m 27s	remaining: 1m 31s
732:	learn: 0.3842410	total: 1m 27s	remaining: 1m 31s
733:	learn: 0.3842057	total: 1m 27s	remaining: 1m 31s
734:	learn: 0.3841623	total: 1m 27s	remaining: 1m 31s
735:	learn: 0.3841082	total: 1m 28s	remaining: 1m 31s
736:	learn: 0.3840404	total: 1m 28s	remaining: 1m 31s
737:	learn: 0.3839930	total: 1m 28s	remaining: 1m 31s
738:	learn: 0.3839353	total: 1m 28s	remaining: 1m 31s
739:	learn: 0.3839321	total: 1m 28s	remaining: 1m 30s
740:	learn: 0.3838730	total: 1m 28s	remaining: 1m 30s
741:	learn: 0.3838165	total: 1m 28s	remaining: 1m 30s
742:	learn: 0.3837697	total: 1m 28s	remaining: 1m 30s
743:	learn: 0.3837189	total: 1m 28s	remaining: 1m 30s
744:	learn: 0.3836671	total: 1m 29s	remaining: 1m 30s
745:	learn: 0.3836332	total: 1m 29s	remaining: 1m 30s
746:	learn: 0.3835811	total: 1m 29s	remaining: 1m 29s
747:	learn: 0.3835268	total: 1m 29s	remaining: 1m 29s
748:	learn: 0.3834684	total: 1m 29s	remaining: 1m 29s
749:	learn: 0.3834180	total: 1m 29s	remaining: 1m 29s
750:	learn: 0.3833693	total: 1m 29s	remaining: 1m 29s
751:	learn: 0.3833164	total: 1m 29s	remaining: 1m 29s
752:	learn: 0.3832768	total: 1m 29s	remaining: 1m 29s
753:	learn: 0.3832207	total: 1m 29s	remaining: 1m 29s
754:	learn: 0.3831783	total: 1m 30s	remaining: 1m 28s
755:	learn: 0.3831228	total: 1m 30s	remaining: 1m 28s
756:	learn: 0.3830708	total: 1m 30s	remaining: 1m 28s
757:	learn: 0.3830203	total: 1m 30s	remaining: 1m 28s
758:	learn: 0.3829689	total: 1m 30s	remaining: 1m 28s
759:	learn: 0.3829114	total: 1m 30s	remaining: 1m 28s
760:	learn: 0.3828573	total: 1m 30s	remaining: 1m 28s
761:	learn: 0.3828055	total: 1m 30s	remaining: 1m 28s
762:	learn: 0.3827725	total: 1m 30s	remaining: 1m 27s
763:	learn: 0.3827321	total: 1m 31s	remaining: 1m 27s
764:	learn: 0.3826706	total: 1m 31s	remaining: 1m 27s
765:	learn: 0.3826176	total: 1m 31s	remaining: 1m 27s
766:	learn: 0.3825678	total: 1m 31s	remaining: 1m 27s
767:	learn: 0.3825121	total: 1m 31s	remaining: 1m 27s
768:	learn: 0.3824670	total: 1m 31s	remaining: 1m 27s
769:	learn: 0.3824274	total: 1m 31s	remaining: 1m 26s
770:	learn: 0.3823875	total: 1m 31s	remaining: 1m 26s
771:	learn: 0.3823436	total: 1m 31s	remaining: 1m 26s
772:	learn: 0.3822951	total: 1m 32s	remaining: 1m 26s
773:	learn: 0.3822542	total: 1m 32s	remaining: 1m 26s
774:	learn: 0.3822082	total: 1m 32s	remaining: 1m 26s
775:	learn: 0.3821691	total: 1m 32s	remaining: 1m 26s
776:	learn: 0.3821185	total: 1m 32s	remaining: 1m 25s
777:	learn: 0.3820714	total: 1m 32s	remaining: 1m 25s
778:	learn: 0.3820389	total: 1m 32s	remaining: 1m 25s
779:	learn: 0.3819935	total: 1m 32s	remaining: 1m 25s
780:	learn: 0.3819345	total: 1m 32s	remaining: 1m 25s
781:	learn: 0.3818901	total: 1m 32s	remaining: 1m 25s
782:	learn: 0.3818521	total: 1m 33s	remaining: 1m 25s
783:	learn: 0.3818036	total: 1m 33s	remaining: 1m 25s
784:	learn: 0.3817675	total: 1m 33s	remaining: 1m 24s
785:	learn: 0.3817326	total: 1m 33s	remaining: 1m 24s
786:	learn: 0.3816920	total: 1m 33s	remaining: 1m 24s
787:	learn: 0.3816429	total: 1m 33s	remaining: 1m 24s
788:	learn: 0.3815849	total: 1m 33s	remaining: 1m 24s
789:	learn: 0.3815154	total: 1m 33s	remaining: 1m 24s
790:	learn: 0.3815120	total: 1m 33s	remaining: 1m 24s
791:	learn: 0.3814539	total: 1m 33s	remaining: 1m 23s
792:	learn: 0.3814005	total: 1m 33s	remaining: 1m 23s
793:	learn: 0.3813413	total: 1m 34s	remaining: 1m 23s
794:	learn: 0.3812877	total: 1m 34s	remaining: 1m 23s
795:	learn: 0.3812312	total: 1m 34s	remaining: 1m 23s
796:	learn: 0.3811903	total: 1m 34s	remaining: 1m 23s
797:	learn: 0.3811429	total: 1m 34s	remaining: 1m 23s
798:	learn: 0.3810934	total: 1m 34s	remaining: 1m 23s
799:	learn: 0.3810455	total: 1m 34s	remaining: 1m 22s
800:	learn: 0.3809833	total: 1m 34s	remaining: 1m 22s
801:	learn: 0.3809287	total: 1m 34s	remaining: 1m 22s
802:	learn: 0.3808779	total: 1m 35s	remaining: 1m 22s
803:	learn: 0.3808457	total: 1m 35s	remaining: 1m 22s
804:	learn: 0.3807992	total: 1m 35s	remaining: 1m 22s
805:	learn: 0.3807493	total: 1m 35s	remaining: 1m 22s
806:	learn: 0.3807015	total: 1m 35s	remaining: 1m 22s
807:	learn: 0.3806450	total: 1m 35s	remaining: 1m 21s
808:	learn: 0.3805990	total: 1m 35s	remaining: 1m 21s
809:	learn: 0.3805478	total: 1m 35s	remaining: 1m 21s
810:	learn: 0.3804981	total: 1m 35s	remaining: 1m 21s
811:	learn: 0.3804465	total: 1m 36s	remaining: 1m 21s
812:	learn: 0.3803891	total: 1m 36s	remaining: 1m 21s
813:	learn: 0.3803560	total: 1m 36s	remaining: 1m 21s
814:	learn: 0.3803146	total: 1m 36s	remaining: 1m 21s
815:	learn: 0.3802742	total: 1m 36s	remaining: 1m 20s
816:	learn: 0.3802145	total: 1m 36s	remaining: 1m 20s
817:	learn: 0.3801752	total: 1m 36s	remaining: 1m 20s
818:	learn: 0.3801215	total: 1m 36s	remaining: 1m 20s
819:	learn: 0.3800725	total: 1m 36s	remaining: 1m 20s
820:	learn: 0.3800326	total: 1m 37s	remaining: 1m 20s
821:	learn: 0.3799817	total: 1m 37s	remaining: 1m 20s
822:	learn: 0.3799230	total: 1m 37s	remaining: 1m 19s
823:	learn: 0.3798868	total: 1m 37s	remaining: 1m 19s
824:	learn: 0.3798460	total: 1m 37s	remaining: 1m 19s
825:	learn: 0.3797992	total: 1m 37s	remaining: 1m 19s
826:	learn: 0.3797536	total: 1m 37s	remaining: 1m 19s
827:	learn: 0.3796591	total: 1m 37s	remaining: 1m 19s
828:	learn: 0.3796152	total: 1m 37s	remaining: 1m 19s
829:	learn: 0.3795617	total: 1m 37s	remaining: 1m 19s
830:	learn: 0.3795189	total: 1m 38s	remaining: 1m 18s
831:	learn: 0.3794666	total: 1m 38s	remaining: 1m 18s
832:	learn: 0.3794376	total: 1m 38s	remaining: 1m 18s
833:	learn: 0.3793956	total: 1m 38s	remaining: 1m 18s
834:	learn: 0.3793613	total: 1m 38s	remaining: 1m 18s
835:	learn: 0.3793052	total: 1m 38s	remaining: 1m 18s
836:	learn: 0.3792578	total: 1m 38s	remaining: 1m 18s
837:	learn: 0.3792191	total: 1m 38s	remaining: 1m 18s
838:	learn: 0.3791730	total: 1m 38s	remaining: 1m 17s
839:	learn: 0.3791247	total: 1m 39s	remaining: 1m 17s
840:	learn: 0.3790854	total: 1m 39s	remaining: 1m 17s
841:	learn: 0.3790305	total: 1m 39s	remaining: 1m 17s
842:	learn: 0.3789920	total: 1m 39s	remaining: 1m 17s
843:	learn: 0.3789605	total: 1m 39s	remaining: 1m 17s
844:	learn: 0.3789268	total: 1m 39s	remaining: 1m 17s
845:	learn: 0.3789030	total: 1m 39s	remaining: 1m 17s
846:	learn: 0.3788665	total: 1m 39s	remaining: 1m 16s
847:	learn: 0.3788173	total: 1m 39s	remaining: 1m 16s
848:	learn: 0.3787602	total: 1m 40s	remaining: 1m 16s
849:	learn: 0.3787099	total: 1m 40s	remaining: 1m 16s
850:	learn: 0.3786458	total: 1m 40s	remaining: 1m 16s
851:	learn: 0.3785983	total: 1m 40s	remaining: 1m 16s
852:	learn: 0.3785609	total: 1m 40s	remaining: 1m 16s
853:	learn: 0.3785175	total: 1m 40s	remaining: 1m 16s
854:	learn: 0.3784699	total: 1m 40s	remaining: 1m 16s
855:	learn: 0.3784160	total: 1m 41s	remaining: 1m 16s
856:	learn: 0.3783830	total: 1m 41s	remaining: 1m 15s
857:	learn: 0.3783396	total: 1m 41s	remaining: 1m 15s
858:	learn: 0.3782921	total: 1m 41s	remaining: 1m 15s
859:	learn: 0.3782403	total: 1m 41s	remaining: 1m 15s
860:	learn: 0.3781916	total: 1m 41s	remaining: 1m 15s
861:	learn: 0.3781777	total: 1m 41s	remaining: 1m 15s
862:	learn: 0.3781274	total: 1m 41s	remaining: 1m 15s
863:	learn: 0.3780619	total: 1m 41s	remaining: 1m 15s
864:	learn: 0.3780236	total: 1m 42s	remaining: 1m 14s
865:	learn: 0.3779982	total: 1m 42s	remaining: 1m 14s
866:	learn: 0.3779555	total: 1m 42s	remaining: 1m 14s
867:	learn: 0.3779040	total: 1m 42s	remaining: 1m 14s
868:	learn: 0.3778608	total: 1m 42s	remaining: 1m 14s
869:	learn: 0.3778078	total: 1m 42s	remaining: 1m 14s
870:	learn: 0.3777615	total: 1m 42s	remaining: 1m 14s
871:	learn: 0.3777198	total: 1m 42s	remaining: 1m 14s
872:	learn: 0.3776684	total: 1m 42s	remaining: 1m 13s
873:	learn: 0.3776182	total: 1m 42s	remaining: 1m 13s
874:	learn: 0.3775682	total: 1m 43s	remaining: 1m 13s
875:	learn: 0.3775278	total: 1m 43s	remaining: 1m 13s
876:	learn: 0.3774722	total: 1m 43s	remaining: 1m 13s
877:	learn: 0.3774311	total: 1m 43s	remaining: 1m 13s
878:	learn: 0.3773818	total: 1m 43s	remaining: 1m 13s
879:	learn: 0.3773538	total: 1m 43s	remaining: 1m 12s
880:	learn: 0.3773108	total: 1m 43s	remaining: 1m 12s
881:	learn: 0.3772607	total: 1m 43s	remaining: 1m 12s
882:	learn: 0.3772113	total: 1m 43s	remaining: 1m 12s
883:	learn: 0.3771628	total: 1m 44s	remaining: 1m 12s
884:	learn: 0.3771079	total: 1m 44s	remaining: 1m 12s
885:	learn: 0.3770609	total: 1m 44s	remaining: 1m 12s
886:	learn: 0.3770209	total: 1m 44s	remaining: 1m 12s
887:	learn: 0.3769676	total: 1m 44s	remaining: 1m 11s
888:	learn: 0.3769239	total: 1m 44s	remaining: 1m 11s
889:	learn: 0.3768722	total: 1m 44s	remaining: 1m 11s
890:	learn: 0.3768476	total: 1m 44s	remaining: 1m 11s
891:	learn: 0.3768026	total: 1m 44s	remaining: 1m 11s
892:	learn: 0.3767510	total: 1m 44s	remaining: 1m 11s
893:	learn: 0.3767005	total: 1m 45s	remaining: 1m 11s
894:	learn: 0.3766679	total: 1m 45s	remaining: 1m 11s
895:	learn: 0.3766341	total: 1m 45s	remaining: 1m 10s
896:	learn: 0.3765976	total: 1m 45s	remaining: 1m 10s
897:	learn: 0.3765537	total: 1m 45s	remaining: 1m 10s
898:	learn: 0.3764970	total: 1m 45s	remaining: 1m 10s
899:	learn: 0.3764403	total: 1m 45s	remaining: 1m 10s
900:	learn: 0.3764118	total: 1m 45s	remaining: 1m 10s
901:	learn: 0.3763732	total: 1m 45s	remaining: 1m 10s
902:	learn: 0.3763177	total: 1m 46s	remaining: 1m 10s
903:	learn: 0.3762650	total: 1m 46s	remaining: 1m 9s
904:	learn: 0.3762117	total: 1m 46s	remaining: 1m 9s
905:	learn: 0.3761688	total: 1m 46s	remaining: 1m 9s
906:	learn: 0.3761184	total: 1m 46s	remaining: 1m 9s
907:	learn: 0.3760669	total: 1m 46s	remaining: 1m 9s
908:	learn: 0.3760233	total: 1m 46s	remaining: 1m 9s
909:	learn: 0.3759827	total: 1m 46s	remaining: 1m 9s
910:	learn: 0.3759391	total: 1m 46s	remaining: 1m 9s
911:	learn: 0.3758962	total: 1m 46s	remaining: 1m 8s
912:	learn: 0.3758619	total: 1m 47s	remaining: 1m 8s
913:	learn: 0.3758196	total: 1m 47s	remaining: 1m 8s
914:	learn: 0.3757679	total: 1m 47s	remaining: 1m 8s
915:	learn: 0.3757137	total: 1m 47s	remaining: 1m 8s
916:	learn: 0.3756760	total: 1m 47s	remaining: 1m 8s
917:	learn: 0.3756271	total: 1m 47s	remaining: 1m 8s
918:	learn: 0.3756107	total: 1m 47s	remaining: 1m 8s
919:	learn: 0.3755551	total: 1m 47s	remaining: 1m 7s
920:	learn: 0.3755024	total: 1m 47s	remaining: 1m 7s
921:	learn: 0.3754295	total: 1m 48s	remaining: 1m 7s
922:	learn: 0.3753871	total: 1m 48s	remaining: 1m 7s
923:	learn: 0.3753402	total: 1m 48s	remaining: 1m 7s
924:	learn: 0.3753000	total: 1m 48s	remaining: 1m 7s
925:	learn: 0.3752659	total: 1m 48s	remaining: 1m 7s
926:	learn: 0.3751974	total: 1m 48s	remaining: 1m 7s
927:	learn: 0.3751376	total: 1m 48s	remaining: 1m 6s
928:	learn: 0.3750872	total: 1m 48s	remaining: 1m 6s
929:	learn: 0.3750482	total: 1m 48s	remaining: 1m 6s
930:	learn: 0.3749291	total: 1m 48s	remaining: 1m 6s
931:	learn: 0.3748799	total: 1m 49s	remaining: 1m 6s
932:	learn: 0.3748194	total: 1m 49s	remaining: 1m 6s
933:	learn: 0.3747799	total: 1m 49s	remaining: 1m 6s
934:	learn: 0.3747321	total: 1m 49s	remaining: 1m 6s
935:	learn: 0.3746840	total: 1m 49s	remaining: 1m 5s
936:	learn: 0.3746342	total: 1m 49s	remaining: 1m 5s
937:	learn: 0.3745825	total: 1m 49s	remaining: 1m 5s
938:	learn: 0.3745684	total: 1m 49s	remaining: 1m 5s
939:	learn: 0.3745176	total: 1m 49s	remaining: 1m 5s
940:	learn: 0.3744712	total: 1m 50s	remaining: 1m 5s
941:	learn: 0.3744283	total: 1m 50s	remaining: 1m 5s
942:	learn: 0.3743917	total: 1m 50s	remaining: 1m 5s
943:	learn: 0.3743409	total: 1m 50s	remaining: 1m 4s
944:	learn: 0.3742894	total: 1m 50s	remaining: 1m 4s
945:	learn: 0.3742283	total: 1m 50s	remaining: 1m 4s
946:	learn: 0.3741778	total: 1m 50s	remaining: 1m 4s
947:	learn: 0.3741365	total: 1m 50s	remaining: 1m 4s
948:	learn: 0.3741052	total: 1m 50s	remaining: 1m 4s
949:	learn: 0.3740689	total: 1m 51s	remaining: 1m 4s
950:	learn: 0.3740251	total: 1m 51s	remaining: 1m 4s
951:	learn: 0.3739772	total: 1m 51s	remaining: 1m 4s
952:	learn: 0.3739665	total: 1m 51s	remaining: 1m 3s
953:	learn: 0.3739355	total: 1m 51s	remaining: 1m 3s
954:	learn: 0.3739035	total: 1m 51s	remaining: 1m 3s
955:	learn: 0.3738715	total: 1m 51s	remaining: 1m 3s
956:	learn: 0.3738250	total: 1m 51s	remaining: 1m 3s
957:	learn: 0.3737793	total: 1m 51s	remaining: 1m 3s
958:	learn: 0.3737504	total: 1m 51s	remaining: 1m 3s
959:	learn: 0.3736977	total: 1m 52s	remaining: 1m 3s
960:	learn: 0.3736593	total: 1m 52s	remaining: 1m 2s
961:	learn: 0.3736045	total: 1m 52s	remaining: 1m 2s
962:	learn: 0.3735463	total: 1m 52s	remaining: 1m 2s
963:	learn: 0.3734916	total: 1m 52s	remaining: 1m 2s
964:	learn: 0.3734453	total: 1m 52s	remaining: 1m 2s
965:	learn: 0.3734021	total: 1m 52s	remaining: 1m 2s
966:	learn: 0.3733473	total: 1m 52s	remaining: 1m 2s
967:	learn: 0.3732960	total: 1m 52s	remaining: 1m 2s
968:	learn: 0.3732561	total: 1m 52s	remaining: 1m 1s
969:	learn: 0.3732035	total: 1m 53s	remaining: 1m 1s
970:	learn: 0.3731484	total: 1m 53s	remaining: 1m 1s
971:	learn: 0.3731122	total: 1m 53s	remaining: 1m 1s
972:	learn: 0.3730781	total: 1m 53s	remaining: 1m 1s
973:	learn: 0.3730571	total: 1m 53s	remaining: 1m 1s
974:	learn: 0.3730096	total: 1m 53s	remaining: 1m 1s
975:	learn: 0.3729623	total: 1m 53s	remaining: 1m 1s
976:	learn: 0.3729273	total: 1m 53s	remaining: 1m
977:	learn: 0.3728862	total: 1m 53s	remaining: 1m
978:	learn: 0.3728613	total: 1m 54s	remaining: 1m
979:	learn: 0.3728120	total: 1m 54s	remaining: 1m
980:	learn: 0.3727791	total: 1m 54s	remaining: 1m
981:	learn: 0.3727268	total: 1m 54s	remaining: 1m
982:	learn: 0.3726850	total: 1m 54s	remaining: 1m
983:	learn: 0.3726446	total: 1m 54s	remaining: 1m
984:	learn: 0.3726024	total: 1m 54s	remaining: 60s
985:	learn: 0.3725641	total: 1m 54s	remaining: 59.9s
986:	learn: 0.3725122	total: 1m 54s	remaining: 59.7s
987:	learn: 0.3724873	total: 1m 55s	remaining: 59.6s
988:	learn: 0.3724438	total: 1m 55s	remaining: 59.5s
989:	learn: 0.3723821	total: 1m 55s	remaining: 59.4s
990:	learn: 0.3723436	total: 1m 55s	remaining: 59.3s
991:	learn: 0.3722885	total: 1m 55s	remaining: 59.2s
992:	learn: 0.3722430	total: 1m 55s	remaining: 59.1s
993:	learn: 0.3721960	total: 1m 55s	remaining: 59s
994:	learn: 0.3721497	total: 1m 55s	remaining: 58.9s
995:	learn: 0.3721103	total: 1m 56s	remaining: 58.7s
996:	learn: 0.3720795	total: 1m 56s	remaining: 58.6s
997:	learn: 0.3720434	total: 1m 56s	remaining: 58.5s
998:	learn: 0.3720047	total: 1m 56s	remaining: 58.4s
999:	learn: 0.3719502	total: 1m 56s	remaining: 58.3s
1000:	learn: 0.3719019	total: 1m 56s	remaining: 58.2s
1001:	learn: 0.3718517	total: 1m 56s	remaining: 58s
1002:	learn: 0.3718088	total: 1m 56s	remaining: 57.9s
1003:	learn: 0.3717659	total: 1m 57s	remaining: 57.8s
1004:	learn: 0.3717161	total: 1m 57s	remaining: 57.7s
1005:	learn: 0.3716712	total: 1m 57s	remaining: 57.6s
1006:	learn: 0.3716298	total: 1m 57s	remaining: 57.5s
1007:	learn: 0.3715874	total: 1m 57s	remaining: 57.3s
1008:	learn: 0.3715319	total: 1m 57s	remaining: 57.2s
1009:	learn: 0.3714802	total: 1m 57s	remaining: 57.1s
1010:	learn: 0.3714327	total: 1m 57s	remaining: 57s
1011:	learn: 0.3713993	total: 1m 57s	remaining: 56.8s
1012:	learn: 0.3713476	total: 1m 57s	remaining: 56.7s
1013:	learn: 0.3712977	total: 1m 58s	remaining: 56.6s
1014:	learn: 0.3712418	total: 1m 58s	remaining: 56.5s
1015:	learn: 0.3711938	total: 1m 58s	remaining: 56.4s
1016:	learn: 0.3711502	total: 1m 58s	remaining: 56.3s
1017:	learn: 0.3710984	total: 1m 58s	remaining: 56.2s
1018:	learn: 0.3710672	total: 1m 58s	remaining: 56s
1019:	learn: 0.3710183	total: 1m 58s	remaining: 55.9s
1020:	learn: 0.3709838	total: 1m 58s	remaining: 55.8s
1021:	learn: 0.3709354	total: 1m 59s	remaining: 55.7s
1022:	learn: 0.3708876	total: 1m 59s	remaining: 55.6s
1023:	learn: 0.3708391	total: 1m 59s	remaining: 55.4s
1024:	learn: 0.3707921	total: 1m 59s	remaining: 55.3s
1025:	learn: 0.3707624	total: 1m 59s	remaining: 55.2s
1026:	learn: 0.3707217	total: 1m 59s	remaining: 55.1s
1027:	learn: 0.3706617	total: 1m 59s	remaining: 55s
1028:	learn: 0.3706168	total: 1m 59s	remaining: 54.8s
1029:	learn: 0.3705740	total: 1m 59s	remaining: 54.7s
1030:	learn: 0.3705247	total: 2m	remaining: 54.6s
1031:	learn: 0.3704914	total: 2m	remaining: 54.5s
1032:	learn: 0.3704402	total: 2m	remaining: 54.3s
1033:	learn: 0.3703934	total: 2m	remaining: 54.2s
1034:	learn: 0.3703410	total: 2m	remaining: 54.1s
1035:	learn: 0.3702918	total: 2m	remaining: 54s
1036:	learn: 0.3702586	total: 2m	remaining: 53.9s
1037:	learn: 0.3702351	total: 2m	remaining: 53.7s
1038:	learn: 0.3702050	total: 2m	remaining: 53.6s
1039:	learn: 0.3701591	total: 2m	remaining: 53.5s
1040:	learn: 0.3701065	total: 2m 1s	remaining: 53.4s
1041:	learn: 0.3700637	total: 2m 1s	remaining: 53.3s
1042:	learn: 0.3700111	total: 2m 1s	remaining: 53.2s
1043:	learn: 0.3699487	total: 2m 1s	remaining: 53s
1044:	learn: 0.3699464	total: 2m 1s	remaining: 52.9s
1045:	learn: 0.3699183	total: 2m 1s	remaining: 52.8s
1046:	learn: 0.3698772	total: 2m 1s	remaining: 52.7s
1047:	learn: 0.3698267	total: 2m 1s	remaining: 52.5s
1048:	learn: 0.3697904	total: 2m 1s	remaining: 52.4s
1049:	learn: 0.3697451	total: 2m 1s	remaining: 52.3s
1050:	learn: 0.3697131	total: 2m 2s	remaining: 52.1s
1051:	learn: 0.3696607	total: 2m 2s	remaining: 52s
1052:	learn: 0.3696177	total: 2m 2s	remaining: 51.9s
1053:	learn: 0.3695792	total: 2m 2s	remaining: 51.8s
1054:	learn: 0.3695369	total: 2m 2s	remaining: 51.7s
1055:	learn: 0.3694998	total: 2m 2s	remaining: 51.6s
1056:	learn: 0.3694667	total: 2m 2s	remaining: 51.4s
1057:	learn: 0.3694217	total: 2m 2s	remaining: 51.3s
1058:	learn: 0.3693905	total: 2m 2s	remaining: 51.2s
1059:	learn: 0.3693430	total: 2m 3s	remaining: 51.1s
1060:	learn: 0.3693001	total: 2m 3s	remaining: 50.9s
1061:	learn: 0.3692495	total: 2m 3s	remaining: 50.8s
1062:	learn: 0.3691948	total: 2m 3s	remaining: 50.7s
1063:	learn: 0.3691602	total: 2m 3s	remaining: 50.6s
1064:	learn: 0.3691179	total: 2m 3s	remaining: 50.5s
1065:	learn: 0.3690700	total: 2m 3s	remaining: 50.4s
1066:	learn: 0.3690207	total: 2m 3s	remaining: 50.3s
1067:	learn: 0.3689785	total: 2m 3s	remaining: 50.2s
1068:	learn: 0.3689424	total: 2m 4s	remaining: 50s
1069:	learn: 0.3688992	total: 2m 4s	remaining: 49.9s
1070:	learn: 0.3688516	total: 2m 4s	remaining: 49.8s
1071:	learn: 0.3687952	total: 2m 4s	remaining: 49.7s
1072:	learn: 0.3687621	total: 2m 4s	remaining: 49.5s
1073:	learn: 0.3687320	total: 2m 4s	remaining: 49.4s
1074:	learn: 0.3686943	total: 2m 4s	remaining: 49.3s
1075:	learn: 0.3686592	total: 2m 4s	remaining: 49.2s
1076:	learn: 0.3686026	total: 2m 4s	remaining: 49.1s
1077:	learn: 0.3685421	total: 2m 5s	remaining: 48.9s
1078:	learn: 0.3684946	total: 2m 5s	remaining: 48.8s
1079:	learn: 0.3684459	total: 2m 5s	remaining: 48.7s
1080:	learn: 0.3684074	total: 2m 5s	remaining: 48.6s
1081:	learn: 0.3683730	total: 2m 5s	remaining: 48.5s
1082:	learn: 0.3683311	total: 2m 5s	remaining: 48.3s
1083:	learn: 0.3682902	total: 2m 5s	remaining: 48.2s
1084:	learn: 0.3682683	total: 2m 5s	remaining: 48.1s
1085:	learn: 0.3682434	total: 2m 5s	remaining: 48s
1086:	learn: 0.3681939	total: 2m 5s	remaining: 47.9s
1087:	learn: 0.3681495	total: 2m 6s	remaining: 47.7s
1088:	learn: 0.3680916	total: 2m 6s	remaining: 47.6s
1089:	learn: 0.3680560	total: 2m 6s	remaining: 47.5s
1090:	learn: 0.3679998	total: 2m 6s	remaining: 47.4s
1091:	learn: 0.3679680	total: 2m 6s	remaining: 47.3s
1092:	learn: 0.3679329	total: 2m 6s	remaining: 47.2s
1093:	learn: 0.3678922	total: 2m 6s	remaining: 47.1s
1094:	learn: 0.3678642	total: 2m 6s	remaining: 46.9s
1095:	learn: 0.3678284	total: 2m 7s	remaining: 46.8s
1096:	learn: 0.3677930	total: 2m 7s	remaining: 46.7s
1097:	learn: 0.3677384	total: 2m 7s	remaining: 46.6s
1098:	learn: 0.3676936	total: 2m 7s	remaining: 46.5s
1099:	learn: 0.3676542	total: 2m 7s	remaining: 46.4s
1100:	learn: 0.3676125	total: 2m 7s	remaining: 46.2s
1101:	learn: 0.3675625	total: 2m 7s	remaining: 46.1s
1102:	learn: 0.3675286	total: 2m 7s	remaining: 46s
1103:	learn: 0.3675065	total: 2m 7s	remaining: 45.9s
1104:	learn: 0.3674540	total: 2m 7s	remaining: 45.8s
1105:	learn: 0.3673993	total: 2m 8s	remaining: 45.6s
1106:	learn: 0.3673841	total: 2m 8s	remaining: 45.5s
1107:	learn: 0.3673468	total: 2m 8s	remaining: 45.4s
1108:	learn: 0.3672970	total: 2m 8s	remaining: 45.3s
1109:	learn: 0.3672498	total: 2m 8s	remaining: 45.2s
1110:	learn: 0.3672112	total: 2m 8s	remaining: 45s
1111:	learn: 0.3671827	total: 2m 8s	remaining: 44.9s
1112:	learn: 0.3671438	total: 2m 8s	remaining: 44.8s
1113:	learn: 0.3671035	total: 2m 8s	remaining: 44.7s
1114:	learn: 0.3670527	total: 2m 9s	remaining: 44.6s
1115:	learn: 0.3670055	total: 2m 9s	remaining: 44.5s
1116:	learn: 0.3669601	total: 2m 9s	remaining: 44.3s
1117:	learn: 0.3669137	total: 2m 9s	remaining: 44.2s
1118:	learn: 0.3668736	total: 2m 9s	remaining: 44.1s
1119:	learn: 0.3668239	total: 2m 9s	remaining: 44s
1120:	learn: 0.3667706	total: 2m 9s	remaining: 43.9s
1121:	learn: 0.3667272	total: 2m 9s	remaining: 43.7s
1122:	learn: 0.3666797	total: 2m 9s	remaining: 43.6s
1123:	learn: 0.3666339	total: 2m 10s	remaining: 43.5s
1124:	learn: 0.3665931	total: 2m 10s	remaining: 43.4s
1125:	learn: 0.3665535	total: 2m 10s	remaining: 43.3s
1126:	learn: 0.3665199	total: 2m 10s	remaining: 43.1s
1127:	learn: 0.3664780	total: 2m 10s	remaining: 43s
1128:	learn: 0.3664475	total: 2m 10s	remaining: 42.9s
1129:	learn: 0.3663855	total: 2m 10s	remaining: 42.8s
1130:	learn: 0.3663461	total: 2m 10s	remaining: 42.7s
1131:	learn: 0.3663027	total: 2m 10s	remaining: 42.6s
1132:	learn: 0.3662756	total: 2m 11s	remaining: 42.5s
1133:	learn: 0.3662391	total: 2m 11s	remaining: 42.4s
1134:	learn: 0.3662015	total: 2m 11s	remaining: 42.2s
1135:	learn: 0.3661599	total: 2m 11s	remaining: 42.1s
1136:	learn: 0.3661169	total: 2m 11s	remaining: 42s
1137:	learn: 0.3660663	total: 2m 11s	remaining: 41.9s
1138:	learn: 0.3660441	total: 2m 11s	remaining: 41.8s
1139:	learn: 0.3659929	total: 2m 11s	remaining: 41.7s
1140:	learn: 0.3659651	total: 2m 12s	remaining: 41.6s
1141:	learn: 0.3659362	total: 2m 12s	remaining: 41.4s
1142:	learn: 0.3658963	total: 2m 12s	remaining: 41.3s
1143:	learn: 0.3658491	total: 2m 12s	remaining: 41.2s
1144:	learn: 0.3658126	total: 2m 12s	remaining: 41.1s
1145:	learn: 0.3657785	total: 2m 12s	remaining: 41s
1146:	learn: 0.3657352	total: 2m 12s	remaining: 40.9s
1147:	learn: 0.3656896	total: 2m 12s	remaining: 40.8s
1148:	learn: 0.3656566	total: 2m 13s	remaining: 40.6s
1149:	learn: 0.3656274	total: 2m 13s	remaining: 40.5s
1150:	learn: 0.3655861	total: 2m 13s	remaining: 40.4s
1151:	learn: 0.3655383	total: 2m 13s	remaining: 40.3s
1152:	learn: 0.3654883	total: 2m 13s	remaining: 40.2s
1153:	learn: 0.3654525	total: 2m 13s	remaining: 40.1s
1154:	learn: 0.3653996	total: 2m 13s	remaining: 40s
1155:	learn: 0.3653512	total: 2m 13s	remaining: 39.8s
1156:	learn: 0.3653237	total: 2m 14s	remaining: 39.7s
1157:	learn: 0.3652843	total: 2m 14s	remaining: 39.6s
1158:	learn: 0.3652460	total: 2m 14s	remaining: 39.5s
1159:	learn: 0.3651944	total: 2m 14s	remaining: 39.4s
1160:	learn: 0.3651478	total: 2m 14s	remaining: 39.3s
1161:	learn: 0.3650968	total: 2m 14s	remaining: 39.2s
1162:	learn: 0.3650632	total: 2m 14s	remaining: 39.1s
1163:	learn: 0.3650268	total: 2m 14s	remaining: 38.9s
1164:	learn: 0.3649796	total: 2m 15s	remaining: 38.8s
1165:	learn: 0.3649454	total: 2m 15s	remaining: 38.7s
1166:	learn: 0.3648968	total: 2m 15s	remaining: 38.6s
1167:	learn: 0.3648900	total: 2m 15s	remaining: 38.5s
1168:	learn: 0.3648611	total: 2m 15s	remaining: 38.4s
1169:	learn: 0.3648151	total: 2m 15s	remaining: 38.2s
1170:	learn: 0.3647785	total: 2m 15s	remaining: 38.1s
1171:	learn: 0.3647259	total: 2m 15s	remaining: 38s
1172:	learn: 0.3646795	total: 2m 15s	remaining: 37.9s
1173:	learn: 0.3646262	total: 2m 16s	remaining: 37.8s
1174:	learn: 0.3645805	total: 2m 16s	remaining: 37.7s
1175:	learn: 0.3645383	total: 2m 16s	remaining: 37.6s
1176:	learn: 0.3645020	total: 2m 16s	remaining: 37.5s
1177:	learn: 0.3644607	total: 2m 16s	remaining: 37.3s
1178:	learn: 0.3644235	total: 2m 16s	remaining: 37.2s
1179:	learn: 0.3643738	total: 2m 16s	remaining: 37.1s
1180:	learn: 0.3643356	total: 2m 16s	remaining: 37s
1181:	learn: 0.3642859	total: 2m 17s	remaining: 36.9s
1182:	learn: 0.3642379	total: 2m 17s	remaining: 36.8s
1183:	learn: 0.3642076	total: 2m 17s	remaining: 36.6s
1184:	learn: 0.3641707	total: 2m 17s	remaining: 36.5s
1185:	learn: 0.3641342	total: 2m 17s	remaining: 36.4s
1186:	learn: 0.3640985	total: 2m 17s	remaining: 36.3s
1187:	learn: 0.3640629	total: 2m 17s	remaining: 36.2s
1188:	learn: 0.3640360	total: 2m 17s	remaining: 36s
1189:	learn: 0.3640015	total: 2m 17s	remaining: 35.9s
1190:	learn: 0.3639689	total: 2m 17s	remaining: 35.8s
1191:	learn: 0.3639370	total: 2m 18s	remaining: 35.7s
1192:	learn: 0.3638879	total: 2m 18s	remaining: 35.6s
1193:	learn: 0.3638416	total: 2m 18s	remaining: 35.5s
1194:	learn: 0.3638082	total: 2m 18s	remaining: 35.3s
1195:	learn: 0.3637702	total: 2m 18s	remaining: 35.2s
1196:	learn: 0.3637234	total: 2m 18s	remaining: 35.1s
1197:	learn: 0.3636671	total: 2m 18s	remaining: 35s
1198:	learn: 0.3636278	total: 2m 18s	remaining: 34.9s
1199:	learn: 0.3635827	total: 2m 19s	remaining: 34.8s
1200:	learn: 0.3635353	total: 2m 19s	remaining: 34.6s
1201:	learn: 0.3634993	total: 2m 19s	remaining: 34.5s
1202:	learn: 0.3634559	total: 2m 19s	remaining: 34.4s
1203:	learn: 0.3634182	total: 2m 19s	remaining: 34.3s
1204:	learn: 0.3633819	total: 2m 19s	remaining: 34.2s
1205:	learn: 0.3633281	total: 2m 19s	remaining: 34.1s
1206:	learn: 0.3632892	total: 2m 19s	remaining: 33.9s
1207:	learn: 0.3632471	total: 2m 19s	remaining: 33.8s
1208:	learn: 0.3631987	total: 2m 20s	remaining: 33.7s
1209:	learn: 0.3631498	total: 2m 20s	remaining: 33.6s
1210:	learn: 0.3631129	total: 2m 20s	remaining: 33.5s
1211:	learn: 0.3630659	total: 2m 20s	remaining: 33.4s
1212:	learn: 0.3630189	total: 2m 20s	remaining: 33.2s
1213:	learn: 0.3629968	total: 2m 20s	remaining: 33.1s
1214:	learn: 0.3629525	total: 2m 20s	remaining: 33s
1215:	learn: 0.3629230	total: 2m 20s	remaining: 32.9s
1216:	learn: 0.3628842	total: 2m 20s	remaining: 32.8s
1217:	learn: 0.3628365	total: 2m 21s	remaining: 32.6s
1218:	learn: 0.3627906	total: 2m 21s	remaining: 32.5s
1219:	learn: 0.3627602	total: 2m 21s	remaining: 32.4s
1220:	learn: 0.3627176	total: 2m 21s	remaining: 32.3s
1221:	learn: 0.3626766	total: 2m 21s	remaining: 32.2s
1222:	learn: 0.3626288	total: 2m 21s	remaining: 32.1s
1223:	learn: 0.3625808	total: 2m 21s	remaining: 31.9s
1224:	learn: 0.3625344	total: 2m 21s	remaining: 31.8s
1225:	learn: 0.3624994	total: 2m 21s	remaining: 31.7s
1226:	learn: 0.3624679	total: 2m 21s	remaining: 31.6s
1227:	learn: 0.3624175	total: 2m 22s	remaining: 31.5s
1228:	learn: 0.3623722	total: 2m 22s	remaining: 31.4s
1229:	learn: 0.3623379	total: 2m 22s	remaining: 31.2s
1230:	learn: 0.3622971	total: 2m 22s	remaining: 31.1s
1231:	learn: 0.3622538	total: 2m 22s	remaining: 31s
1232:	learn: 0.3622247	total: 2m 22s	remaining: 30.9s
1233:	learn: 0.3621773	total: 2m 22s	remaining: 30.8s
1234:	learn: 0.3621266	total: 2m 22s	remaining: 30.7s
1235:	learn: 0.3620824	total: 2m 22s	remaining: 30.5s
1236:	learn: 0.3620504	total: 2m 23s	remaining: 30.4s
1237:	learn: 0.3620079	total: 2m 23s	remaining: 30.3s
1238:	learn: 0.3619633	total: 2m 23s	remaining: 30.2s
1239:	learn: 0.3619227	total: 2m 23s	remaining: 30.1s
1240:	learn: 0.3618927	total: 2m 23s	remaining: 30s
1241:	learn: 0.3618430	total: 2m 23s	remaining: 29.8s
1242:	learn: 0.3617994	total: 2m 23s	remaining: 29.7s
1243:	learn: 0.3617575	total: 2m 23s	remaining: 29.6s
1244:	learn: 0.3617127	total: 2m 23s	remaining: 29.5s
1245:	learn: 0.3616811	total: 2m 24s	remaining: 29.4s
1246:	learn: 0.3616408	total: 2m 24s	remaining: 29.2s
1247:	learn: 0.3616157	total: 2m 24s	remaining: 29.1s
1248:	learn: 0.3615738	total: 2m 24s	remaining: 29s
1249:	learn: 0.3615299	total: 2m 24s	remaining: 28.9s
1250:	learn: 0.3614847	total: 2m 24s	remaining: 28.8s
1251:	learn: 0.3614425	total: 2m 24s	remaining: 28.7s
1252:	learn: 0.3614009	total: 2m 24s	remaining: 28.5s
1253:	learn: 0.3613647	total: 2m 24s	remaining: 28.4s
1254:	learn: 0.3613359	total: 2m 25s	remaining: 28.3s
1255:	learn: 0.3612740	total: 2m 25s	remaining: 28.2s
1256:	learn: 0.3612352	total: 2m 25s	remaining: 28.1s
1257:	learn: 0.3611967	total: 2m 25s	remaining: 28s
1258:	learn: 0.3611474	total: 2m 25s	remaining: 27.9s
1259:	learn: 0.3611154	total: 2m 25s	remaining: 27.7s
1260:	learn: 0.3610666	total: 2m 25s	remaining: 27.6s
1261:	learn: 0.3610235	total: 2m 25s	remaining: 27.5s
1262:	learn: 0.3609904	total: 2m 25s	remaining: 27.4s
1263:	learn: 0.3609444	total: 2m 26s	remaining: 27.3s
1264:	learn: 0.3608970	total: 2m 26s	remaining: 27.2s
1265:	learn: 0.3608499	total: 2m 26s	remaining: 27s
1266:	learn: 0.3608050	total: 2m 26s	remaining: 26.9s
1267:	learn: 0.3607715	total: 2m 26s	remaining: 26.8s
1268:	learn: 0.3607348	total: 2m 26s	remaining: 26.7s
1269:	learn: 0.3606813	total: 2m 26s	remaining: 26.6s
1270:	learn: 0.3606396	total: 2m 26s	remaining: 26.5s
1271:	learn: 0.3605883	total: 2m 27s	remaining: 26.4s
1272:	learn: 0.3605395	total: 2m 27s	remaining: 26.2s
1273:	learn: 0.3604978	total: 2m 27s	remaining: 26.1s
1274:	learn: 0.3604653	total: 2m 27s	remaining: 26s
1275:	learn: 0.3604288	total: 2m 27s	remaining: 25.9s
1276:	learn: 0.3603947	total: 2m 27s	remaining: 25.8s
1277:	learn: 0.3603433	total: 2m 27s	remaining: 25.7s
1278:	learn: 0.3603068	total: 2m 27s	remaining: 25.6s
1279:	learn: 0.3602793	total: 2m 28s	remaining: 25.5s
1280:	learn: 0.3602285	total: 2m 28s	remaining: 25.4s
1281:	learn: 0.3601875	total: 2m 28s	remaining: 25.3s
1282:	learn: 0.3601303	total: 2m 28s	remaining: 25.2s
1283:	learn: 0.3600846	total: 2m 29s	remaining: 25.1s
1284:	learn: 0.3600354	total: 2m 29s	remaining: 25s
1285:	learn: 0.3599968	total: 2m 29s	remaining: 24.9s
1286:	learn: 0.3599576	total: 2m 29s	remaining: 24.7s
1287:	learn: 0.3599095	total: 2m 29s	remaining: 24.6s
1288:	learn: 0.3598647	total: 2m 29s	remaining: 24.5s
1289:	learn: 0.3598153	total: 2m 29s	remaining: 24.4s
1290:	learn: 0.3597838	total: 2m 30s	remaining: 24.3s
1291:	learn: 0.3597500	total: 2m 30s	remaining: 24.2s
1292:	learn: 0.3597091	total: 2m 30s	remaining: 24.1s
1293:	learn: 0.3596591	total: 2m 30s	remaining: 23.9s
1294:	learn: 0.3596133	total: 2m 30s	remaining: 23.8s
1295:	learn: 0.3595833	total: 2m 30s	remaining: 23.7s
1296:	learn: 0.3595506	total: 2m 30s	remaining: 23.6s
1297:	learn: 0.3595076	total: 2m 30s	remaining: 23.5s
1298:	learn: 0.3594624	total: 2m 31s	remaining: 23.4s
1299:	learn: 0.3594219	total: 2m 31s	remaining: 23.3s
1300:	learn: 0.3593802	total: 2m 31s	remaining: 23.1s
1301:	learn: 0.3593595	total: 2m 31s	remaining: 23s
1302:	learn: 0.3593090	total: 2m 31s	remaining: 22.9s
1303:	learn: 0.3592677	total: 2m 31s	remaining: 22.8s
1304:	learn: 0.3592309	total: 2m 31s	remaining: 22.7s
1305:	learn: 0.3591893	total: 2m 31s	remaining: 22.6s
1306:	learn: 0.3591485	total: 2m 31s	remaining: 22.4s
1307:	learn: 0.3590982	total: 2m 32s	remaining: 22.3s
1308:	learn: 0.3590703	total: 2m 32s	remaining: 22.2s
1309:	learn: 0.3590225	total: 2m 32s	remaining: 22.1s
1310:	learn: 0.3589728	total: 2m 32s	remaining: 22s
1311:	learn: 0.3589433	total: 2m 32s	remaining: 21.9s
1312:	learn: 0.3588978	total: 2m 32s	remaining: 21.7s
1313:	learn: 0.3588535	total: 2m 32s	remaining: 21.6s
1314:	learn: 0.3588111	total: 2m 32s	remaining: 21.5s
1315:	learn: 0.3587760	total: 2m 33s	remaining: 21.4s
1316:	learn: 0.3587267	total: 2m 33s	remaining: 21.3s
1317:	learn: 0.3586781	total: 2m 33s	remaining: 21.2s
1318:	learn: 0.3586317	total: 2m 33s	remaining: 21.1s
1319:	learn: 0.3585976	total: 2m 33s	remaining: 20.9s
1320:	learn: 0.3585601	total: 2m 33s	remaining: 20.8s
1321:	learn: 0.3585197	total: 2m 33s	remaining: 20.7s
1322:	learn: 0.3584871	total: 2m 33s	remaining: 20.6s
1323:	learn: 0.3584683	total: 2m 34s	remaining: 20.5s
1324:	learn: 0.3584363	total: 2m 34s	remaining: 20.4s
1325:	learn: 0.3584094	total: 2m 34s	remaining: 20.2s
1326:	learn: 0.3583821	total: 2m 34s	remaining: 20.1s
1327:	learn: 0.3583394	total: 2m 34s	remaining: 20s
1328:	learn: 0.3582969	total: 2m 34s	remaining: 19.9s
1329:	learn: 0.3582631	total: 2m 34s	remaining: 19.8s
1330:	learn: 0.3582099	total: 2m 34s	remaining: 19.7s
1331:	learn: 0.3581829	total: 2m 35s	remaining: 19.5s
1332:	learn: 0.3581252	total: 2m 35s	remaining: 19.4s
1333:	learn: 0.3580865	total: 2m 35s	remaining: 19.3s
1334:	learn: 0.3580554	total: 2m 35s	remaining: 19.2s
1335:	learn: 0.3580168	total: 2m 35s	remaining: 19.1s
1336:	learn: 0.3579576	total: 2m 35s	remaining: 19s
1337:	learn: 0.3579144	total: 2m 35s	remaining: 18.9s
1338:	learn: 0.3578770	total: 2m 35s	remaining: 18.7s
1339:	learn: 0.3578325	total: 2m 35s	remaining: 18.6s
1340:	learn: 0.3578041	total: 2m 36s	remaining: 18.5s
1341:	learn: 0.3577608	total: 2m 36s	remaining: 18.4s
1342:	learn: 0.3577104	total: 2m 36s	remaining: 18.3s
1343:	learn: 0.3576841	total: 2m 36s	remaining: 18.2s
1344:	learn: 0.3576332	total: 2m 36s	remaining: 18s
1345:	learn: 0.3575874	total: 2m 36s	remaining: 17.9s
1346:	learn: 0.3575464	total: 2m 36s	remaining: 17.8s
1347:	learn: 0.3575122	total: 2m 36s	remaining: 17.7s
1348:	learn: 0.3574665	total: 2m 37s	remaining: 17.6s
1349:	learn: 0.3574347	total: 2m 37s	remaining: 17.5s
1350:	learn: 0.3573881	total: 2m 37s	remaining: 17.4s
1351:	learn: 0.3573645	total: 2m 37s	remaining: 17.2s
1352:	learn: 0.3573209	total: 2m 37s	remaining: 17.1s
1353:	learn: 0.3572899	total: 2m 37s	remaining: 17s
1354:	learn: 0.3572602	total: 2m 37s	remaining: 16.9s
1355:	learn: 0.3572190	total: 2m 38s	remaining: 16.8s
1356:	learn: 0.3571976	total: 2m 38s	remaining: 16.7s
1357:	learn: 0.3571664	total: 2m 38s	remaining: 16.5s
1358:	learn: 0.3571156	total: 2m 38s	remaining: 16.4s
1359:	learn: 0.3570697	total: 2m 38s	remaining: 16.3s
1360:	learn: 0.3570267	total: 2m 38s	remaining: 16.2s
1361:	learn: 0.3569811	total: 2m 38s	remaining: 16.1s
1362:	learn: 0.3569440	total: 2m 38s	remaining: 16s
1363:	learn: 0.3569132	total: 2m 38s	remaining: 15.8s
1364:	learn: 0.3568755	total: 2m 39s	remaining: 15.7s
1365:	learn: 0.3568407	total: 2m 39s	remaining: 15.6s
1366:	learn: 0.3568049	total: 2m 39s	remaining: 15.5s
1367:	learn: 0.3567681	total: 2m 39s	remaining: 15.4s
1368:	learn: 0.3567167	total: 2m 39s	remaining: 15.3s
1369:	learn: 0.3567080	total: 2m 39s	remaining: 15.1s
1370:	learn: 0.3566695	total: 2m 39s	remaining: 15s
1371:	learn: 0.3566266	total: 2m 39s	remaining: 14.9s
1372:	learn: 0.3565952	total: 2m 39s	remaining: 14.8s
1373:	learn: 0.3565440	total: 2m 40s	remaining: 14.7s
1374:	learn: 0.3565023	total: 2m 40s	remaining: 14.6s
1375:	learn: 0.3564661	total: 2m 40s	remaining: 14.4s
1376:	learn: 0.3564301	total: 2m 40s	remaining: 14.3s
1377:	learn: 0.3563913	total: 2m 40s	remaining: 14.2s
1378:	learn: 0.3563458	total: 2m 40s	remaining: 14.1s
1379:	learn: 0.3563120	total: 2m 40s	remaining: 14s
1380:	learn: 0.3562631	total: 2m 40s	remaining: 13.9s
1381:	learn: 0.3562167	total: 2m 41s	remaining: 13.8s
1382:	learn: 0.3561732	total: 2m 41s	remaining: 13.6s
1383:	learn: 0.3561337	total: 2m 41s	remaining: 13.5s
1384:	learn: 0.3560913	total: 2m 41s	remaining: 13.4s
1385:	learn: 0.3560506	total: 2m 41s	remaining: 13.3s
1386:	learn: 0.3560228	total: 2m 41s	remaining: 13.2s
1387:	learn: 0.3559749	total: 2m 41s	remaining: 13.1s
1388:	learn: 0.3559316	total: 2m 41s	remaining: 12.9s
1389:	learn: 0.3558880	total: 2m 42s	remaining: 12.8s
1390:	learn: 0.3558556	total: 2m 42s	remaining: 12.7s
1391:	learn: 0.3558112	total: 2m 42s	remaining: 12.6s
1392:	learn: 0.3557715	total: 2m 42s	remaining: 12.5s
1393:	learn: 0.3557334	total: 2m 42s	remaining: 12.4s
1394:	learn: 0.3556874	total: 2m 42s	remaining: 12.2s
1395:	learn: 0.3556444	total: 2m 42s	remaining: 12.1s
1396:	learn: 0.3556069	total: 2m 42s	remaining: 12s
1397:	learn: 0.3555117	total: 2m 43s	remaining: 11.9s
1398:	learn: 0.3554622	total: 2m 43s	remaining: 11.8s
1399:	learn: 0.3554204	total: 2m 43s	remaining: 11.7s
1400:	learn: 0.3553858	total: 2m 43s	remaining: 11.5s
1401:	learn: 0.3553636	total: 2m 43s	remaining: 11.4s
1402:	learn: 0.3553220	total: 2m 43s	remaining: 11.3s
1403:	learn: 0.3552755	total: 2m 43s	remaining: 11.2s
1404:	learn: 0.3552296	total: 2m 43s	remaining: 11.1s
1405:	learn: 0.3551928	total: 2m 44s	remaining: 11s
1406:	learn: 0.3551630	total: 2m 44s	remaining: 10.9s
1407:	learn: 0.3551154	total: 2m 44s	remaining: 10.7s
1408:	learn: 0.3550714	total: 2m 44s	remaining: 10.6s
1409:	learn: 0.3550203	total: 2m 44s	remaining: 10.5s
1410:	learn: 0.3549807	total: 2m 44s	remaining: 10.4s
1411:	learn: 0.3549259	total: 2m 44s	remaining: 10.3s
1412:	learn: 0.3548952	total: 2m 45s	remaining: 10.2s
1413:	learn: 0.3548637	total: 2m 45s	remaining: 10s
1414:	learn: 0.3548315	total: 2m 45s	remaining: 9.93s
1415:	learn: 0.3547867	total: 2m 45s	remaining: 9.81s
1416:	learn: 0.3547476	total: 2m 45s	remaining: 9.69s
1417:	learn: 0.3546966	total: 2m 45s	remaining: 9.58s
1418:	learn: 0.3546536	total: 2m 45s	remaining: 9.46s
1419:	learn: 0.3546024	total: 2m 45s	remaining: 9.34s
1420:	learn: 0.3545566	total: 2m 45s	remaining: 9.23s
1421:	learn: 0.3545203	total: 2m 46s	remaining: 9.11s
1422:	learn: 0.3544928	total: 2m 46s	remaining: 8.99s
1423:	learn: 0.3544519	total: 2m 46s	remaining: 8.88s
1424:	learn: 0.3544159	total: 2m 46s	remaining: 8.76s
1425:	learn: 0.3543661	total: 2m 46s	remaining: 8.64s
1426:	learn: 0.3543156	total: 2m 46s	remaining: 8.53s
1427:	learn: 0.3542737	total: 2m 46s	remaining: 8.41s
1428:	learn: 0.3542504	total: 2m 46s	remaining: 8.29s
1429:	learn: 0.3542114	total: 2m 47s	remaining: 8.18s
1430:	learn: 0.3541729	total: 2m 47s	remaining: 8.06s
1431:	learn: 0.3541416	total: 2m 47s	remaining: 7.94s
1432:	learn: 0.3541043	total: 2m 47s	remaining: 7.82s
1433:	learn: 0.3540745	total: 2m 47s	remaining: 7.71s
1434:	learn: 0.3540371	total: 2m 47s	remaining: 7.59s
1435:	learn: 0.3540000	total: 2m 47s	remaining: 7.47s
1436:	learn: 0.3539615	total: 2m 47s	remaining: 7.36s
1437:	learn: 0.3539282	total: 2m 47s	remaining: 7.24s
1438:	learn: 0.3538984	total: 2m 48s	remaining: 7.12s
1439:	learn: 0.3538655	total: 2m 48s	remaining: 7s
1440:	learn: 0.3538398	total: 2m 48s	remaining: 6.89s
1441:	learn: 0.3537925	total: 2m 48s	remaining: 6.77s
1442:	learn: 0.3537625	total: 2m 48s	remaining: 6.65s
1443:	learn: 0.3537117	total: 2m 48s	remaining: 6.54s
1444:	learn: 0.3536735	total: 2m 48s	remaining: 6.42s
1445:	learn: 0.3536395	total: 2m 48s	remaining: 6.3s
1446:	learn: 0.3536009	total: 2m 48s	remaining: 6.19s
1447:	learn: 0.3535606	total: 2m 49s	remaining: 6.07s
1448:	learn: 0.3535168	total: 2m 49s	remaining: 5.95s
1449:	learn: 0.3534857	total: 2m 49s	remaining: 5.84s
1450:	learn: 0.3534390	total: 2m 49s	remaining: 5.72s
1451:	learn: 0.3533948	total: 2m 49s	remaining: 5.6s
1452:	learn: 0.3533522	total: 2m 49s	remaining: 5.49s
1453:	learn: 0.3533084	total: 2m 49s	remaining: 5.37s
1454:	learn: 0.3532843	total: 2m 49s	remaining: 5.25s
1455:	learn: 0.3532487	total: 2m 49s	remaining: 5.14s
1456:	learn: 0.3532140	total: 2m 50s	remaining: 5.02s
1457:	learn: 0.3531715	total: 2m 50s	remaining: 4.9s
1458:	learn: 0.3531345	total: 2m 50s	remaining: 4.79s
1459:	learn: 0.3531118	total: 2m 50s	remaining: 4.67s
1460:	learn: 0.3530698	total: 2m 50s	remaining: 4.55s
1461:	learn: 0.3530377	total: 2m 50s	remaining: 4.43s
1462:	learn: 0.3529946	total: 2m 50s	remaining: 4.32s
1463:	learn: 0.3529613	total: 2m 50s	remaining: 4.2s
1464:	learn: 0.3529278	total: 2m 50s	remaining: 4.08s
1465:	learn: 0.3528738	total: 2m 51s	remaining: 3.97s
1466:	learn: 0.3528461	total: 2m 51s	remaining: 3.85s
1467:	learn: 0.3528100	total: 2m 51s	remaining: 3.73s
1468:	learn: 0.3527717	total: 2m 51s	remaining: 3.62s
1469:	learn: 0.3527286	total: 2m 51s	remaining: 3.5s
1470:	learn: 0.3526794	total: 2m 51s	remaining: 3.38s
1471:	learn: 0.3526317	total: 2m 51s	remaining: 3.27s
1472:	learn: 0.3525900	total: 2m 51s	remaining: 3.15s
1473:	learn: 0.3525520	total: 2m 52s	remaining: 3.03s
1474:	learn: 0.3525365	total: 2m 52s	remaining: 2.92s
1475:	learn: 0.3524972	total: 2m 52s	remaining: 2.8s
1476:	learn: 0.3524689	total: 2m 52s	remaining: 2.68s
1477:	learn: 0.3524415	total: 2m 52s	remaining: 2.57s
1478:	learn: 0.3523915	total: 2m 52s	remaining: 2.45s
1479:	learn: 0.3523435	total: 2m 52s	remaining: 2.33s
1480:	learn: 0.3522964	total: 2m 52s	remaining: 2.22s
1481:	learn: 0.3522613	total: 2m 52s	remaining: 2.1s
1482:	learn: 0.3522212	total: 2m 53s	remaining: 1.98s
1483:	learn: 0.3521779	total: 2m 53s	remaining: 1.87s
1484:	learn: 0.3521387	total: 2m 53s	remaining: 1.75s
1485:	learn: 0.3520963	total: 2m 53s	remaining: 1.63s
1486:	learn: 0.3520444	total: 2m 53s	remaining: 1.52s
1487:	learn: 0.3520099	total: 2m 53s	remaining: 1.4s
1488:	learn: 0.3519668	total: 2m 53s	remaining: 1.28s
1489:	learn: 0.3519303	total: 2m 53s	remaining: 1.17s
1490:	learn: 0.3518822	total: 2m 53s	remaining: 1.05s
1491:	learn: 0.3518414	total: 2m 54s	remaining: 933ms
1492:	learn: 0.3518156	total: 2m 54s	remaining: 817ms
1493:	learn: 0.3517875	total: 2m 54s	remaining: 700ms
1494:	learn: 0.3517499	total: 2m 54s	remaining: 583ms
1495:	learn: 0.3516992	total: 2m 54s	remaining: 467ms
1496:	learn: 0.3516608	total: 2m 54s	remaining: 350ms
1497:	learn: 0.3516170	total: 2m 54s	remaining: 233ms
1498:	learn: 0.3515882	total: 2m 54s	remaining: 117ms
1499:	learn: 0.3515472	total: 2m 55s	remaining: 0us
Wall time: 6min 21s
Out[94]:
VotingClassifier(estimators=[('xgbc',
                              XGBClassifier(base_score=None, booster='gbtree',
                                            colsample_bylevel=None,
                                            colsample_bynode=1,
                                            colsample_bytree=1,
                                            enable_categorical=False, gamma=0,
                                            gpu_id=None, importance_type=None,
                                            interaction_constraints=None,
                                            learning_rate=0.3,
                                            max_delta_step=None, max_depth=6,
                                            min_child_weight=None, missing=nan,
                                            monotone_constraints=None,
                                            n_estimators=103, n_jobs=None,
                                            num_parallel_tree=None,
                                            predictor=None, random_state=57,
                                            reg_alpha=None, reg_lambda=None,
                                            scale_pos_weight=None, subsample=1,
                                            tree_method=None,
                                            validate_parameters=None,
                                            verbosity=None)),
                             ('lgbc',
                              LGBMClassifier(n_estimators=2000,
                                             objective='binary',
                                             random_state=57)),
                             ('catgbc',
                              <catboost.core.CatBoostClassifier object at 0x00000232C1A05760>)])

Compute FPR + FNR score

In [95]:
voting_clf_y_pred = voting_clf.predict(voting_gbc_X_valid)
valid_score = criterion(voting_clf_y_pred, voting_gbc_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.3820401455229502

To further inspect the performance

In [96]:
# to further inspect the performance:
CM = confusion_matrix(voting_gbc_y_valid, voting_clf_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
gb_y_prob = catgbc.predict_proba(voting_gbc_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(voting_gbc_y_valid, gb_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[24445  5263]
 [ 6093 23646]]
Accuracy: 0.8089726983699767
False Positive Rate: 0.17715766796822405
False Negative Rate: 0.2048824775547261
FPR + FNR = 0.3820401455229502

Result discussion

The $FPR$ and $FNR$ rate is the lower for this notebook: 0.382. The voting system provides that encompass Three boosting models give better performance than XGBoost, CatBoost and LightGBM on the binary classification task. Indeed, the combination of 3 models associated with a majority voting system definitely improves the performance on the binary classification task for this data challenge.

2dn Approach : (Soft Voting)

Predicts the class label based on the argmax of the sums of the predicted probabilities

Split whole dataset

In [14]:
voting_gbc_X_train, voting_gbc_X_valid, voting_gbc_y_train, voting_gbc_y_valid = train_test_split(X_dataframe, y, test_size=0.2, random_state=69)

Define hyperparameters and fit the model

In [15]:
%%time

estimators = [
    ('xgbc', XGBClassifier(booster='gbtree', learning_rate=0.3, 
                     max_depth=6, n_estimators=103, 
                     colsample_bynode=1, colsample_bytree=1,
                     subsample=1, gamma=0, 
                     objective='binary:logistic', random_state=69)),
    
    ('lgbc', LGBMClassifier(objective= 'binary', 
                            n_estimators = 2000, random_state=69)),
    
    ('catgbc', CatBoostClassifier(eval_metric= 'Logloss', iterations= 1500, 
                                  learning_rate= 0.1, subsample= 0.8, random_state=69))
]

voting_clf = VotingClassifier(estimators=estimators, voting='soft')

voting_clf.fit(voting_gbc_X_train, voting_gbc_y_train)
[23:15:04] WARNING: C:/Users/Administrator/workspace/xgboost-win64_release_1.5.1/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
0:	learn: 0.6644069	total: 123ms	remaining: 3m 4s
1:	learn: 0.6413879	total: 263ms	remaining: 3m 16s
2:	learn: 0.6217786	total: 428ms	remaining: 3m 33s
3:	learn: 0.6051211	total: 594ms	remaining: 3m 42s
4:	learn: 0.5915623	total: 770ms	remaining: 3m 50s
5:	learn: 0.5802356	total: 957ms	remaining: 3m 58s
6:	learn: 0.5701495	total: 1.2s	remaining: 4m 15s
7:	learn: 0.5619293	total: 1.39s	remaining: 4m 18s
8:	learn: 0.5541249	total: 1.61s	remaining: 4m 26s
9:	learn: 0.5476307	total: 1.8s	remaining: 4m 27s
10:	learn: 0.5417005	total: 2.03s	remaining: 4m 34s
11:	learn: 0.5359633	total: 2.2s	remaining: 4m 33s
12:	learn: 0.5315532	total: 2.4s	remaining: 4m 34s
13:	learn: 0.5280497	total: 2.58s	remaining: 4m 34s
14:	learn: 0.5242676	total: 2.76s	remaining: 4m 32s
15:	learn: 0.5205642	total: 2.92s	remaining: 4m 31s
16:	learn: 0.5176319	total: 3.08s	remaining: 4m 28s
17:	learn: 0.5151316	total: 3.25s	remaining: 4m 28s
18:	learn: 0.5127681	total: 3.4s	remaining: 4m 25s
19:	learn: 0.5100742	total: 3.57s	remaining: 4m 24s
20:	learn: 0.5075819	total: 3.71s	remaining: 4m 21s
21:	learn: 0.5052233	total: 3.87s	remaining: 4m 19s
22:	learn: 0.5039394	total: 3.98s	remaining: 4m 15s
23:	learn: 0.5018719	total: 4.13s	remaining: 4m 14s
24:	learn: 0.5005282	total: 4.25s	remaining: 4m 11s
25:	learn: 0.4989850	total: 4.4s	remaining: 4m 9s
26:	learn: 0.4977486	total: 4.52s	remaining: 4m 6s
27:	learn: 0.4963492	total: 4.64s	remaining: 4m 3s
28:	learn: 0.4949196	total: 4.75s	remaining: 4m 1s
29:	learn: 0.4931673	total: 4.91s	remaining: 4m
30:	learn: 0.4919073	total: 5.05s	remaining: 3m 59s
31:	learn: 0.4906702	total: 5.18s	remaining: 3m 57s
32:	learn: 0.4883404	total: 5.33s	remaining: 3m 57s
33:	learn: 0.4875844	total: 5.46s	remaining: 3m 55s
34:	learn: 0.4865509	total: 5.59s	remaining: 3m 53s
35:	learn: 0.4854356	total: 5.75s	remaining: 3m 53s
36:	learn: 0.4842781	total: 5.87s	remaining: 3m 52s
37:	learn: 0.4832641	total: 6.02s	remaining: 3m 51s
38:	learn: 0.4816876	total: 6.18s	remaining: 3m 51s
39:	learn: 0.4807400	total: 6.33s	remaining: 3m 50s
40:	learn: 0.4795544	total: 6.47s	remaining: 3m 50s
41:	learn: 0.4787492	total: 6.6s	remaining: 3m 49s
42:	learn: 0.4779107	total: 6.73s	remaining: 3m 47s
43:	learn: 0.4773764	total: 6.87s	remaining: 3m 47s
44:	learn: 0.4764628	total: 7.02s	remaining: 3m 47s
45:	learn: 0.4751953	total: 7.15s	remaining: 3m 45s
46:	learn: 0.4743092	total: 7.26s	remaining: 3m 44s
47:	learn: 0.4735970	total: 7.4s	remaining: 3m 43s
48:	learn: 0.4729773	total: 7.54s	remaining: 3m 43s
49:	learn: 0.4722864	total: 7.65s	remaining: 3m 41s
50:	learn: 0.4717020	total: 7.8s	remaining: 3m 41s
51:	learn: 0.4712075	total: 7.91s	remaining: 3m 40s
52:	learn: 0.4694561	total: 8.08s	remaining: 3m 40s
53:	learn: 0.4676710	total: 8.24s	remaining: 3m 40s
54:	learn: 0.4663698	total: 8.43s	remaining: 3m 41s
55:	learn: 0.4658478	total: 8.56s	remaining: 3m 40s
56:	learn: 0.4649789	total: 8.68s	remaining: 3m 39s
57:	learn: 0.4633337	total: 8.85s	remaining: 3m 40s
58:	learn: 0.4628455	total: 8.99s	remaining: 3m 39s
59:	learn: 0.4622407	total: 9.11s	remaining: 3m 38s
60:	learn: 0.4618239	total: 9.25s	remaining: 3m 38s
61:	learn: 0.4614529	total: 9.38s	remaining: 3m 37s
62:	learn: 0.4609737	total: 9.51s	remaining: 3m 36s
63:	learn: 0.4603494	total: 9.62s	remaining: 3m 35s
64:	learn: 0.4594891	total: 9.76s	remaining: 3m 35s
65:	learn: 0.4590771	total: 9.89s	remaining: 3m 34s
66:	learn: 0.4582123	total: 10.1s	remaining: 3m 35s
67:	learn: 0.4578109	total: 10.2s	remaining: 3m 34s
68:	learn: 0.4570307	total: 10.3s	remaining: 3m 34s
69:	learn: 0.4566284	total: 10.5s	remaining: 3m 34s
70:	learn: 0.4563833	total: 10.6s	remaining: 3m 34s
71:	learn: 0.4556598	total: 10.8s	remaining: 3m 34s
72:	learn: 0.4552356	total: 11s	remaining: 3m 34s
73:	learn: 0.4549035	total: 11.1s	remaining: 3m 33s
74:	learn: 0.4545183	total: 11.2s	remaining: 3m 33s
75:	learn: 0.4541772	total: 11.4s	remaining: 3m 33s
76:	learn: 0.4539356	total: 11.5s	remaining: 3m 32s
77:	learn: 0.4531922	total: 11.6s	remaining: 3m 31s
78:	learn: 0.4527316	total: 11.8s	remaining: 3m 31s
79:	learn: 0.4523630	total: 11.9s	remaining: 3m 31s
80:	learn: 0.4518140	total: 12s	remaining: 3m 30s
81:	learn: 0.4513373	total: 12.2s	remaining: 3m 30s
82:	learn: 0.4509435	total: 12.3s	remaining: 3m 29s
83:	learn: 0.4507224	total: 12.4s	remaining: 3m 28s
84:	learn: 0.4504272	total: 12.5s	remaining: 3m 28s
85:	learn: 0.4498205	total: 12.7s	remaining: 3m 28s
86:	learn: 0.4495133	total: 12.8s	remaining: 3m 27s
87:	learn: 0.4492475	total: 12.9s	remaining: 3m 27s
88:	learn: 0.4489185	total: 13s	remaining: 3m 26s
89:	learn: 0.4485178	total: 13.2s	remaining: 3m 26s
90:	learn: 0.4482531	total: 13.3s	remaining: 3m 25s
91:	learn: 0.4479470	total: 13.4s	remaining: 3m 25s
92:	learn: 0.4474081	total: 13.5s	remaining: 3m 24s
93:	learn: 0.4468284	total: 13.6s	remaining: 3m 24s
94:	learn: 0.4465667	total: 13.8s	remaining: 3m 23s
95:	learn: 0.4463136	total: 13.9s	remaining: 3m 23s
96:	learn: 0.4458362	total: 14s	remaining: 3m 22s
97:	learn: 0.4455627	total: 14.2s	remaining: 3m 22s
98:	learn: 0.4453976	total: 14.3s	remaining: 3m 22s
99:	learn: 0.4451658	total: 14.4s	remaining: 3m 21s
100:	learn: 0.4448525	total: 14.5s	remaining: 3m 21s
101:	learn: 0.4445131	total: 14.7s	remaining: 3m 21s
102:	learn: 0.4440787	total: 14.8s	remaining: 3m 20s
103:	learn: 0.4438756	total: 14.9s	remaining: 3m 20s
104:	learn: 0.4436674	total: 15.1s	remaining: 3m 20s
105:	learn: 0.4434230	total: 15.2s	remaining: 3m 19s
106:	learn: 0.4432324	total: 15.4s	remaining: 3m 19s
107:	learn: 0.4430476	total: 15.5s	remaining: 3m 19s
108:	learn: 0.4428180	total: 15.6s	remaining: 3m 18s
109:	learn: 0.4426361	total: 15.7s	remaining: 3m 18s
110:	learn: 0.4421775	total: 15.8s	remaining: 3m 18s
111:	learn: 0.4419260	total: 16s	remaining: 3m 17s
112:	learn: 0.4417161	total: 16.1s	remaining: 3m 17s
113:	learn: 0.4416039	total: 16.2s	remaining: 3m 17s
114:	learn: 0.4412265	total: 16.4s	remaining: 3m 17s
115:	learn: 0.4406634	total: 16.5s	remaining: 3m 16s
116:	learn: 0.4404406	total: 16.6s	remaining: 3m 16s
117:	learn: 0.4402110	total: 16.8s	remaining: 3m 16s
118:	learn: 0.4398846	total: 16.9s	remaining: 3m 16s
119:	learn: 0.4396244	total: 17s	remaining: 3m 15s
120:	learn: 0.4394308	total: 17.2s	remaining: 3m 16s
121:	learn: 0.4392743	total: 17.3s	remaining: 3m 15s
122:	learn: 0.4388648	total: 17.5s	remaining: 3m 15s
123:	learn: 0.4387046	total: 17.6s	remaining: 3m 15s
124:	learn: 0.4384311	total: 17.7s	remaining: 3m 15s
125:	learn: 0.4381242	total: 17.9s	remaining: 3m 14s
126:	learn: 0.4378436	total: 18.1s	remaining: 3m 15s
127:	learn: 0.4376111	total: 18.2s	remaining: 3m 14s
128:	learn: 0.4374603	total: 18.3s	remaining: 3m 14s
129:	learn: 0.4370102	total: 18.5s	remaining: 3m 14s
130:	learn: 0.4368030	total: 18.6s	remaining: 3m 14s
131:	learn: 0.4366472	total: 18.8s	remaining: 3m 14s
132:	learn: 0.4364632	total: 18.9s	remaining: 3m 14s
133:	learn: 0.4362160	total: 19.1s	remaining: 3m 14s
134:	learn: 0.4359901	total: 19.2s	remaining: 3m 13s
135:	learn: 0.4356832	total: 19.3s	remaining: 3m 13s
136:	learn: 0.4355143	total: 19.4s	remaining: 3m 13s
137:	learn: 0.4352999	total: 19.5s	remaining: 3m 12s
138:	learn: 0.4350857	total: 19.6s	remaining: 3m 12s
139:	learn: 0.4349335	total: 19.8s	remaining: 3m 12s
140:	learn: 0.4347945	total: 19.9s	remaining: 3m 11s
141:	learn: 0.4345854	total: 20s	remaining: 3m 11s
142:	learn: 0.4343473	total: 20.1s	remaining: 3m 10s
143:	learn: 0.4341620	total: 20.2s	remaining: 3m 10s
144:	learn: 0.4337380	total: 20.4s	remaining: 3m 10s
145:	learn: 0.4332087	total: 20.5s	remaining: 3m 10s
146:	learn: 0.4329967	total: 20.6s	remaining: 3m 9s
147:	learn: 0.4327843	total: 20.8s	remaining: 3m 9s
148:	learn: 0.4325485	total: 20.9s	remaining: 3m 9s
149:	learn: 0.4323416	total: 21.1s	remaining: 3m 10s
150:	learn: 0.4321525	total: 21.3s	remaining: 3m 10s
151:	learn: 0.4319864	total: 21.4s	remaining: 3m 10s
152:	learn: 0.4317987	total: 21.6s	remaining: 3m 10s
153:	learn: 0.4314799	total: 21.7s	remaining: 3m 9s
154:	learn: 0.4312609	total: 21.9s	remaining: 3m 9s
155:	learn: 0.4311178	total: 22s	remaining: 3m 9s
156:	learn: 0.4309803	total: 22.1s	remaining: 3m 9s
157:	learn: 0.4308476	total: 22.3s	remaining: 3m 9s
158:	learn: 0.4306410	total: 22.4s	remaining: 3m 8s
159:	learn: 0.4304789	total: 22.6s	remaining: 3m 8s
160:	learn: 0.4302681	total: 22.7s	remaining: 3m 9s
161:	learn: 0.4301279	total: 22.9s	remaining: 3m 9s
162:	learn: 0.4299697	total: 23.1s	remaining: 3m 9s
163:	learn: 0.4298275	total: 23.2s	remaining: 3m 9s
164:	learn: 0.4293664	total: 23.4s	remaining: 3m 9s
165:	learn: 0.4290906	total: 23.5s	remaining: 3m 9s
166:	learn: 0.4289613	total: 23.7s	remaining: 3m 8s
167:	learn: 0.4288313	total: 23.8s	remaining: 3m 8s
168:	learn: 0.4285768	total: 23.9s	remaining: 3m 8s
169:	learn: 0.4284080	total: 24s	remaining: 3m 7s
170:	learn: 0.4281658	total: 24.1s	remaining: 3m 7s
171:	learn: 0.4280283	total: 24.3s	remaining: 3m 7s
172:	learn: 0.4279123	total: 24.4s	remaining: 3m 7s
173:	learn: 0.4277730	total: 24.5s	remaining: 3m 6s
174:	learn: 0.4275886	total: 24.7s	remaining: 3m 6s
175:	learn: 0.4274294	total: 24.8s	remaining: 3m 6s
176:	learn: 0.4273086	total: 24.9s	remaining: 3m 6s
177:	learn: 0.4271649	total: 25s	remaining: 3m 5s
178:	learn: 0.4269883	total: 25.2s	remaining: 3m 5s
179:	learn: 0.4268556	total: 25.3s	remaining: 3m 5s
180:	learn: 0.4266914	total: 25.4s	remaining: 3m 5s
181:	learn: 0.4265456	total: 25.6s	remaining: 3m 5s
182:	learn: 0.4264059	total: 25.7s	remaining: 3m 5s
183:	learn: 0.4262883	total: 25.8s	remaining: 3m 4s
184:	learn: 0.4261471	total: 26s	remaining: 3m 4s
185:	learn: 0.4260455	total: 26.1s	remaining: 3m 4s
186:	learn: 0.4259176	total: 26.2s	remaining: 3m 3s
187:	learn: 0.4256978	total: 26.3s	remaining: 3m 3s
188:	learn: 0.4254563	total: 26.4s	remaining: 3m 3s
189:	learn: 0.4251629	total: 26.6s	remaining: 3m 3s
190:	learn: 0.4248331	total: 26.7s	remaining: 3m 3s
191:	learn: 0.4246900	total: 26.8s	remaining: 3m 2s
192:	learn: 0.4243781	total: 27s	remaining: 3m 2s
193:	learn: 0.4241714	total: 27.1s	remaining: 3m 2s
194:	learn: 0.4240777	total: 27.2s	remaining: 3m 2s
195:	learn: 0.4239680	total: 27.3s	remaining: 3m 1s
196:	learn: 0.4237142	total: 27.5s	remaining: 3m 1s
197:	learn: 0.4234031	total: 27.6s	remaining: 3m 1s
198:	learn: 0.4232093	total: 27.7s	remaining: 3m 1s
199:	learn: 0.4229804	total: 27.8s	remaining: 3m
200:	learn: 0.4228383	total: 27.9s	remaining: 3m
201:	learn: 0.4226886	total: 28.1s	remaining: 3m
202:	learn: 0.4224621	total: 28.2s	remaining: 3m
203:	learn: 0.4223539	total: 28.3s	remaining: 2m 59s
204:	learn: 0.4222476	total: 28.4s	remaining: 2m 59s
205:	learn: 0.4221295	total: 28.5s	remaining: 2m 59s
206:	learn: 0.4219616	total: 28.7s	remaining: 2m 59s
207:	learn: 0.4218513	total: 28.8s	remaining: 2m 58s
208:	learn: 0.4217146	total: 28.9s	remaining: 2m 58s
209:	learn: 0.4215542	total: 29.1s	remaining: 2m 58s
210:	learn: 0.4214381	total: 29.2s	remaining: 2m 58s
211:	learn: 0.4213195	total: 29.3s	remaining: 2m 57s
212:	learn: 0.4212386	total: 29.5s	remaining: 2m 57s
213:	learn: 0.4211348	total: 29.6s	remaining: 2m 57s
214:	learn: 0.4210553	total: 29.7s	remaining: 2m 57s
215:	learn: 0.4208880	total: 29.9s	remaining: 2m 57s
216:	learn: 0.4207752	total: 30s	remaining: 2m 57s
217:	learn: 0.4206132	total: 30.1s	remaining: 2m 57s
218:	learn: 0.4205163	total: 30.2s	remaining: 2m 56s
219:	learn: 0.4204128	total: 30.4s	remaining: 2m 56s
220:	learn: 0.4202436	total: 30.5s	remaining: 2m 56s
221:	learn: 0.4200981	total: 30.6s	remaining: 2m 56s
222:	learn: 0.4199860	total: 30.8s	remaining: 2m 56s
223:	learn: 0.4198822	total: 30.9s	remaining: 2m 55s
224:	learn: 0.4197698	total: 31s	remaining: 2m 55s
225:	learn: 0.4195982	total: 31.1s	remaining: 2m 55s
226:	learn: 0.4194708	total: 31.3s	remaining: 2m 55s
227:	learn: 0.4193695	total: 31.4s	remaining: 2m 55s
228:	learn: 0.4191938	total: 31.5s	remaining: 2m 54s
229:	learn: 0.4190571	total: 31.6s	remaining: 2m 54s
230:	learn: 0.4189667	total: 31.7s	remaining: 2m 54s
231:	learn: 0.4187854	total: 31.9s	remaining: 2m 54s
232:	learn: 0.4186278	total: 32s	remaining: 2m 54s
233:	learn: 0.4185247	total: 32.1s	remaining: 2m 53s
234:	learn: 0.4184386	total: 32.3s	remaining: 2m 53s
235:	learn: 0.4183115	total: 32.4s	remaining: 2m 53s
236:	learn: 0.4181942	total: 32.6s	remaining: 2m 53s
237:	learn: 0.4180493	total: 32.7s	remaining: 2m 53s
238:	learn: 0.4179476	total: 32.9s	remaining: 2m 53s
239:	learn: 0.4178163	total: 33.1s	remaining: 2m 53s
240:	learn: 0.4177006	total: 33.2s	remaining: 2m 53s
241:	learn: 0.4175679	total: 33.4s	remaining: 2m 53s
242:	learn: 0.4174576	total: 33.5s	remaining: 2m 53s
243:	learn: 0.4173666	total: 33.7s	remaining: 2m 53s
244:	learn: 0.4172643	total: 33.8s	remaining: 2m 53s
245:	learn: 0.4171686	total: 34s	remaining: 2m 53s
246:	learn: 0.4170666	total: 34.1s	remaining: 2m 52s
247:	learn: 0.4170008	total: 34.2s	remaining: 2m 52s
248:	learn: 0.4168826	total: 34.4s	remaining: 2m 52s
249:	learn: 0.4167331	total: 34.5s	remaining: 2m 52s
250:	learn: 0.4166416	total: 34.7s	remaining: 2m 52s
251:	learn: 0.4164889	total: 34.8s	remaining: 2m 52s
252:	learn: 0.4163816	total: 34.9s	remaining: 2m 52s
253:	learn: 0.4163004	total: 35.1s	remaining: 2m 52s
254:	learn: 0.4161985	total: 35.2s	remaining: 2m 51s
255:	learn: 0.4160559	total: 35.3s	remaining: 2m 51s
256:	learn: 0.4159812	total: 35.5s	remaining: 2m 51s
257:	learn: 0.4159050	total: 35.6s	remaining: 2m 51s
258:	learn: 0.4157950	total: 35.7s	remaining: 2m 51s
259:	learn: 0.4156911	total: 35.8s	remaining: 2m 50s
260:	learn: 0.4155733	total: 36s	remaining: 2m 50s
261:	learn: 0.4154630	total: 36.1s	remaining: 2m 50s
262:	learn: 0.4153513	total: 36.2s	remaining: 2m 50s
263:	learn: 0.4152842	total: 36.4s	remaining: 2m 50s
264:	learn: 0.4151473	total: 36.5s	remaining: 2m 50s
265:	learn: 0.4150826	total: 36.6s	remaining: 2m 49s
266:	learn: 0.4149359	total: 36.7s	remaining: 2m 49s
267:	learn: 0.4148697	total: 36.8s	remaining: 2m 49s
268:	learn: 0.4147928	total: 37s	remaining: 2m 49s
269:	learn: 0.4146618	total: 37.1s	remaining: 2m 48s
270:	learn: 0.4145741	total: 37.2s	remaining: 2m 48s
271:	learn: 0.4144898	total: 37.3s	remaining: 2m 48s
272:	learn: 0.4144162	total: 37.5s	remaining: 2m 48s
273:	learn: 0.4143429	total: 37.6s	remaining: 2m 48s
274:	learn: 0.4142646	total: 37.7s	remaining: 2m 47s
275:	learn: 0.4141635	total: 37.8s	remaining: 2m 47s
276:	learn: 0.4140915	total: 37.9s	remaining: 2m 47s
277:	learn: 0.4139964	total: 38s	remaining: 2m 47s
278:	learn: 0.4138748	total: 38.2s	remaining: 2m 47s
279:	learn: 0.4137460	total: 38.3s	remaining: 2m 46s
280:	learn: 0.4136608	total: 38.4s	remaining: 2m 46s
281:	learn: 0.4135683	total: 38.6s	remaining: 2m 46s
282:	learn: 0.4134942	total: 38.7s	remaining: 2m 46s
283:	learn: 0.4133965	total: 38.8s	remaining: 2m 46s
284:	learn: 0.4132836	total: 38.9s	remaining: 2m 45s
285:	learn: 0.4132210	total: 39s	remaining: 2m 45s
286:	learn: 0.4131112	total: 39.2s	remaining: 2m 45s
287:	learn: 0.4130227	total: 39.3s	remaining: 2m 45s
288:	learn: 0.4129268	total: 39.4s	remaining: 2m 45s
289:	learn: 0.4128381	total: 39.5s	remaining: 2m 44s
290:	learn: 0.4127648	total: 39.6s	remaining: 2m 44s
291:	learn: 0.4126948	total: 39.8s	remaining: 2m 44s
292:	learn: 0.4125583	total: 39.9s	remaining: 2m 44s
293:	learn: 0.4124660	total: 40s	remaining: 2m 44s
294:	learn: 0.4123640	total: 40.1s	remaining: 2m 43s
295:	learn: 0.4122789	total: 40.3s	remaining: 2m 43s
296:	learn: 0.4122040	total: 40.4s	remaining: 2m 43s
297:	learn: 0.4121318	total: 40.5s	remaining: 2m 43s
298:	learn: 0.4120417	total: 40.7s	remaining: 2m 43s
299:	learn: 0.4119681	total: 40.8s	remaining: 2m 43s
300:	learn: 0.4118818	total: 40.9s	remaining: 2m 42s
301:	learn: 0.4117822	total: 41s	remaining: 2m 42s
302:	learn: 0.4116918	total: 41.4s	remaining: 2m 43s
303:	learn: 0.4115801	total: 41.5s	remaining: 2m 43s
304:	learn: 0.4114931	total: 41.7s	remaining: 2m 43s
305:	learn: 0.4114291	total: 41.8s	remaining: 2m 43s
306:	learn: 0.4113215	total: 42s	remaining: 2m 43s
307:	learn: 0.4112491	total: 42.1s	remaining: 2m 43s
308:	learn: 0.4111650	total: 42.3s	remaining: 2m 42s
309:	learn: 0.4111076	total: 42.4s	remaining: 2m 42s
310:	learn: 0.4110142	total: 42.6s	remaining: 2m 42s
311:	learn: 0.4109105	total: 42.7s	remaining: 2m 42s
312:	learn: 0.4108364	total: 42.8s	remaining: 2m 42s
313:	learn: 0.4107501	total: 43s	remaining: 2m 42s
314:	learn: 0.4106352	total: 43.1s	remaining: 2m 42s
315:	learn: 0.4105547	total: 43.3s	remaining: 2m 42s
316:	learn: 0.4104820	total: 43.4s	remaining: 2m 41s
317:	learn: 0.4103838	total: 43.5s	remaining: 2m 41s
318:	learn: 0.4103083	total: 43.7s	remaining: 2m 41s
319:	learn: 0.4102127	total: 43.9s	remaining: 2m 41s
320:	learn: 0.4101327	total: 44.1s	remaining: 2m 41s
321:	learn: 0.4100511	total: 44.2s	remaining: 2m 41s
322:	learn: 0.4099918	total: 44.4s	remaining: 2m 41s
323:	learn: 0.4099156	total: 44.5s	remaining: 2m 41s
324:	learn: 0.4098480	total: 44.7s	remaining: 2m 41s
325:	learn: 0.4097526	total: 44.9s	remaining: 2m 41s
326:	learn: 0.4096916	total: 45s	remaining: 2m 41s
327:	learn: 0.4096110	total: 45.2s	remaining: 2m 41s
328:	learn: 0.4095292	total: 45.3s	remaining: 2m 41s
329:	learn: 0.4094510	total: 45.5s	remaining: 2m 41s
330:	learn: 0.4093776	total: 45.6s	remaining: 2m 41s
331:	learn: 0.4093215	total: 45.8s	remaining: 2m 41s
332:	learn: 0.4092121	total: 46s	remaining: 2m 41s
333:	learn: 0.4091326	total: 46.1s	remaining: 2m 41s
334:	learn: 0.4090361	total: 46.3s	remaining: 2m 41s
335:	learn: 0.4089558	total: 46.4s	remaining: 2m 40s
336:	learn: 0.4089151	total: 46.6s	remaining: 2m 40s
337:	learn: 0.4088619	total: 46.7s	remaining: 2m 40s
338:	learn: 0.4087813	total: 46.8s	remaining: 2m 40s
339:	learn: 0.4086873	total: 47s	remaining: 2m 40s
340:	learn: 0.4086222	total: 47.2s	remaining: 2m 40s
341:	learn: 0.4085416	total: 47.3s	remaining: 2m 40s
342:	learn: 0.4084692	total: 47.4s	remaining: 2m 39s
343:	learn: 0.4083817	total: 47.6s	remaining: 2m 39s
344:	learn: 0.4083084	total: 47.7s	remaining: 2m 39s
345:	learn: 0.4082559	total: 47.9s	remaining: 2m 39s
346:	learn: 0.4081769	total: 48s	remaining: 2m 39s
347:	learn: 0.4081004	total: 48.2s	remaining: 2m 39s
348:	learn: 0.4080280	total: 48.3s	remaining: 2m 39s
349:	learn: 0.4079787	total: 48.5s	remaining: 2m 39s
350:	learn: 0.4079247	total: 48.6s	remaining: 2m 39s
351:	learn: 0.4078407	total: 48.8s	remaining: 2m 39s
352:	learn: 0.4077700	total: 48.9s	remaining: 2m 38s
353:	learn: 0.4077017	total: 49.1s	remaining: 2m 38s
354:	learn: 0.4076311	total: 49.2s	remaining: 2m 38s
355:	learn: 0.4075688	total: 49.3s	remaining: 2m 38s
356:	learn: 0.4075019	total: 49.5s	remaining: 2m 38s
357:	learn: 0.4074409	total: 49.6s	remaining: 2m 38s
358:	learn: 0.4073387	total: 49.8s	remaining: 2m 38s
359:	learn: 0.4072915	total: 50s	remaining: 2m 38s
360:	learn: 0.4072038	total: 50.2s	remaining: 2m 38s
361:	learn: 0.4071108	total: 50.3s	remaining: 2m 38s
362:	learn: 0.4070467	total: 50.5s	remaining: 2m 38s
363:	learn: 0.4069615	total: 50.7s	remaining: 2m 38s
364:	learn: 0.4068665	total: 50.8s	remaining: 2m 38s
365:	learn: 0.4067898	total: 51s	remaining: 2m 37s
366:	learn: 0.4067391	total: 51.1s	remaining: 2m 37s
367:	learn: 0.4066601	total: 51.3s	remaining: 2m 37s
368:	learn: 0.4065916	total: 51.4s	remaining: 2m 37s
369:	learn: 0.4064870	total: 51.6s	remaining: 2m 37s
370:	learn: 0.4063657	total: 51.7s	remaining: 2m 37s
371:	learn: 0.4062972	total: 51.9s	remaining: 2m 37s
372:	learn: 0.4062160	total: 52.1s	remaining: 2m 37s
373:	learn: 0.4061454	total: 52.2s	remaining: 2m 37s
374:	learn: 0.4060838	total: 52.4s	remaining: 2m 37s
375:	learn: 0.4060124	total: 52.5s	remaining: 2m 37s
376:	learn: 0.4059393	total: 52.7s	remaining: 2m 36s
377:	learn: 0.4058750	total: 52.9s	remaining: 2m 36s
378:	learn: 0.4057952	total: 53s	remaining: 2m 36s
379:	learn: 0.4057255	total: 53.2s	remaining: 2m 36s
380:	learn: 0.4056525	total: 53.3s	remaining: 2m 36s
381:	learn: 0.4055516	total: 53.4s	remaining: 2m 36s
382:	learn: 0.4054641	total: 53.6s	remaining: 2m 36s
383:	learn: 0.4054035	total: 53.7s	remaining: 2m 36s
384:	learn: 0.4053226	total: 53.9s	remaining: 2m 36s
385:	learn: 0.4052558	total: 54s	remaining: 2m 35s
386:	learn: 0.4051715	total: 54.1s	remaining: 2m 35s
387:	learn: 0.4050978	total: 54.3s	remaining: 2m 35s
388:	learn: 0.4050131	total: 54.4s	remaining: 2m 35s
389:	learn: 0.4049430	total: 54.5s	remaining: 2m 35s
390:	learn: 0.4048814	total: 54.6s	remaining: 2m 34s
391:	learn: 0.4048137	total: 54.8s	remaining: 2m 34s
392:	learn: 0.4047488	total: 54.9s	remaining: 2m 34s
393:	learn: 0.4046806	total: 55.1s	remaining: 2m 34s
394:	learn: 0.4046151	total: 55.2s	remaining: 2m 34s
395:	learn: 0.4045509	total: 55.3s	remaining: 2m 34s
396:	learn: 0.4044824	total: 55.4s	remaining: 2m 34s
397:	learn: 0.4043984	total: 55.6s	remaining: 2m 33s
398:	learn: 0.4043501	total: 55.7s	remaining: 2m 33s
399:	learn: 0.4042601	total: 55.8s	remaining: 2m 33s
400:	learn: 0.4042084	total: 55.9s	remaining: 2m 33s
401:	learn: 0.4041741	total: 56s	remaining: 2m 33s
402:	learn: 0.4041015	total: 56.2s	remaining: 2m 33s
403:	learn: 0.4040588	total: 56.3s	remaining: 2m 32s
404:	learn: 0.4039929	total: 56.4s	remaining: 2m 32s
405:	learn: 0.4039013	total: 56.6s	remaining: 2m 32s
406:	learn: 0.4038477	total: 56.7s	remaining: 2m 32s
407:	learn: 0.4037812	total: 56.8s	remaining: 2m 32s
408:	learn: 0.4037010	total: 57s	remaining: 2m 32s
409:	learn: 0.4036403	total: 57.1s	remaining: 2m 31s
410:	learn: 0.4035789	total: 57.3s	remaining: 2m 31s
411:	learn: 0.4035145	total: 57.4s	remaining: 2m 31s
412:	learn: 0.4034260	total: 57.5s	remaining: 2m 31s
413:	learn: 0.4033491	total: 57.7s	remaining: 2m 31s
414:	learn: 0.4033116	total: 57.8s	remaining: 2m 31s
415:	learn: 0.4032482	total: 57.9s	remaining: 2m 30s
416:	learn: 0.4031826	total: 58s	remaining: 2m 30s
417:	learn: 0.4031299	total: 58.1s	remaining: 2m 30s
418:	learn: 0.4030732	total: 58.3s	remaining: 2m 30s
419:	learn: 0.4029900	total: 58.4s	remaining: 2m 30s
420:	learn: 0.4029020	total: 58.5s	remaining: 2m 30s
421:	learn: 0.4028547	total: 58.7s	remaining: 2m 29s
422:	learn: 0.4027830	total: 58.8s	remaining: 2m 29s
423:	learn: 0.4027218	total: 58.9s	remaining: 2m 29s
424:	learn: 0.4026541	total: 59s	remaining: 2m 29s
425:	learn: 0.4025655	total: 59.2s	remaining: 2m 29s
426:	learn: 0.4024944	total: 59.3s	remaining: 2m 29s
427:	learn: 0.4024454	total: 59.4s	remaining: 2m 28s
428:	learn: 0.4023702	total: 59.6s	remaining: 2m 28s
429:	learn: 0.4022953	total: 59.7s	remaining: 2m 28s
430:	learn: 0.4022323	total: 59.8s	remaining: 2m 28s
431:	learn: 0.4021625	total: 59.9s	remaining: 2m 28s
432:	learn: 0.4021084	total: 1m	remaining: 2m 27s
433:	learn: 0.4020481	total: 1m	remaining: 2m 27s
434:	learn: 0.4019971	total: 1m	remaining: 2m 27s
435:	learn: 0.4019473	total: 1m	remaining: 2m 27s
436:	learn: 0.4018745	total: 1m	remaining: 2m 27s
437:	learn: 0.4018007	total: 1m	remaining: 2m 27s
438:	learn: 0.4017329	total: 1m	remaining: 2m 26s
439:	learn: 0.4016884	total: 1m	remaining: 2m 26s
440:	learn: 0.4016144	total: 1m 1s	remaining: 2m 26s
441:	learn: 0.4015413	total: 1m 1s	remaining: 2m 26s
442:	learn: 0.4014716	total: 1m 1s	remaining: 2m 26s
443:	learn: 0.4014152	total: 1m 1s	remaining: 2m 26s
444:	learn: 0.4013880	total: 1m 1s	remaining: 2m 25s
445:	learn: 0.4013376	total: 1m 1s	remaining: 2m 25s
446:	learn: 0.4012669	total: 1m 1s	remaining: 2m 25s
447:	learn: 0.4011913	total: 1m 1s	remaining: 2m 25s
448:	learn: 0.4011209	total: 1m 2s	remaining: 2m 25s
449:	learn: 0.4010629	total: 1m 2s	remaining: 2m 25s
450:	learn: 0.4009927	total: 1m 2s	remaining: 2m 25s
451:	learn: 0.4009375	total: 1m 2s	remaining: 2m 24s
452:	learn: 0.4008759	total: 1m 2s	remaining: 2m 24s
453:	learn: 0.4008484	total: 1m 2s	remaining: 2m 24s
454:	learn: 0.4007839	total: 1m 2s	remaining: 2m 24s
455:	learn: 0.4007154	total: 1m 3s	remaining: 2m 24s
456:	learn: 0.4006611	total: 1m 3s	remaining: 2m 24s
457:	learn: 0.4005519	total: 1m 3s	remaining: 2m 23s
458:	learn: 0.4004870	total: 1m 3s	remaining: 2m 23s
459:	learn: 0.4004257	total: 1m 3s	remaining: 2m 23s
460:	learn: 0.4003759	total: 1m 3s	remaining: 2m 23s
461:	learn: 0.4003167	total: 1m 3s	remaining: 2m 23s
462:	learn: 0.4002557	total: 1m 3s	remaining: 2m 23s
463:	learn: 0.4001951	total: 1m 4s	remaining: 2m 22s
464:	learn: 0.4001335	total: 1m 4s	remaining: 2m 22s
465:	learn: 0.4000812	total: 1m 4s	remaining: 2m 22s
466:	learn: 0.4000027	total: 1m 4s	remaining: 2m 22s
467:	learn: 0.3999457	total: 1m 4s	remaining: 2m 22s
468:	learn: 0.3998807	total: 1m 4s	remaining: 2m 22s
469:	learn: 0.3998550	total: 1m 4s	remaining: 2m 22s
470:	learn: 0.3997821	total: 1m 5s	remaining: 2m 22s
471:	learn: 0.3997138	total: 1m 5s	remaining: 2m 22s
472:	learn: 0.3996471	total: 1m 5s	remaining: 2m 21s
473:	learn: 0.3995858	total: 1m 5s	remaining: 2m 21s
474:	learn: 0.3995379	total: 1m 5s	remaining: 2m 21s
475:	learn: 0.3994610	total: 1m 5s	remaining: 2m 21s
476:	learn: 0.3993910	total: 1m 5s	remaining: 2m 21s
477:	learn: 0.3993276	total: 1m 6s	remaining: 2m 21s
478:	learn: 0.3992736	total: 1m 6s	remaining: 2m 21s
479:	learn: 0.3992190	total: 1m 6s	remaining: 2m 20s
480:	learn: 0.3991678	total: 1m 6s	remaining: 2m 20s
481:	learn: 0.3991011	total: 1m 6s	remaining: 2m 20s
482:	learn: 0.3990449	total: 1m 6s	remaining: 2m 20s
483:	learn: 0.3989953	total: 1m 6s	remaining: 2m 20s
484:	learn: 0.3989300	total: 1m 7s	remaining: 2m 20s
485:	learn: 0.3988793	total: 1m 7s	remaining: 2m 20s
486:	learn: 0.3988255	total: 1m 7s	remaining: 2m 19s
487:	learn: 0.3987740	total: 1m 7s	remaining: 2m 19s
488:	learn: 0.3987257	total: 1m 7s	remaining: 2m 19s
489:	learn: 0.3986632	total: 1m 7s	remaining: 2m 19s
490:	learn: 0.3986393	total: 1m 7s	remaining: 2m 19s
491:	learn: 0.3985743	total: 1m 7s	remaining: 2m 19s
492:	learn: 0.3985211	total: 1m 8s	remaining: 2m 18s
493:	learn: 0.3984556	total: 1m 8s	remaining: 2m 18s
494:	learn: 0.3984159	total: 1m 8s	remaining: 2m 18s
495:	learn: 0.3983471	total: 1m 8s	remaining: 2m 18s
496:	learn: 0.3982859	total: 1m 8s	remaining: 2m 18s
497:	learn: 0.3982306	total: 1m 8s	remaining: 2m 18s
498:	learn: 0.3981681	total: 1m 8s	remaining: 2m 17s
499:	learn: 0.3981084	total: 1m 8s	remaining: 2m 17s
500:	learn: 0.3980672	total: 1m 8s	remaining: 2m 17s
501:	learn: 0.3979762	total: 1m 9s	remaining: 2m 17s
502:	learn: 0.3979283	total: 1m 9s	remaining: 2m 17s
503:	learn: 0.3978923	total: 1m 9s	remaining: 2m 17s
504:	learn: 0.3978351	total: 1m 9s	remaining: 2m 16s
505:	learn: 0.3977707	total: 1m 9s	remaining: 2m 16s
506:	learn: 0.3977045	total: 1m 9s	remaining: 2m 16s
507:	learn: 0.3976422	total: 1m 9s	remaining: 2m 16s
508:	learn: 0.3975891	total: 1m 10s	remaining: 2m 16s
509:	learn: 0.3975490	total: 1m 10s	remaining: 2m 16s
510:	learn: 0.3975082	total: 1m 10s	remaining: 2m 15s
511:	learn: 0.3974560	total: 1m 10s	remaining: 2m 15s
512:	learn: 0.3973840	total: 1m 10s	remaining: 2m 15s
513:	learn: 0.3973239	total: 1m 10s	remaining: 2m 15s
514:	learn: 0.3972629	total: 1m 10s	remaining: 2m 15s
515:	learn: 0.3972077	total: 1m 10s	remaining: 2m 15s
516:	learn: 0.3971530	total: 1m 11s	remaining: 2m 15s
517:	learn: 0.3970931	total: 1m 11s	remaining: 2m 14s
518:	learn: 0.3970389	total: 1m 11s	remaining: 2m 14s
519:	learn: 0.3969755	total: 1m 11s	remaining: 2m 14s
520:	learn: 0.3969126	total: 1m 11s	remaining: 2m 14s
521:	learn: 0.3968778	total: 1m 11s	remaining: 2m 14s
522:	learn: 0.3968427	total: 1m 11s	remaining: 2m 14s
523:	learn: 0.3967845	total: 1m 11s	remaining: 2m 13s
524:	learn: 0.3967227	total: 1m 12s	remaining: 2m 13s
525:	learn: 0.3966617	total: 1m 12s	remaining: 2m 13s
526:	learn: 0.3966116	total: 1m 12s	remaining: 2m 13s
527:	learn: 0.3965508	total: 1m 12s	remaining: 2m 13s
528:	learn: 0.3964860	total: 1m 12s	remaining: 2m 13s
529:	learn: 0.3964414	total: 1m 12s	remaining: 2m 13s
530:	learn: 0.3963828	total: 1m 12s	remaining: 2m 12s
531:	learn: 0.3963186	total: 1m 12s	remaining: 2m 12s
532:	learn: 0.3962508	total: 1m 13s	remaining: 2m 12s
533:	learn: 0.3961832	total: 1m 13s	remaining: 2m 12s
534:	learn: 0.3961282	total: 1m 13s	remaining: 2m 12s
535:	learn: 0.3960586	total: 1m 13s	remaining: 2m 12s
536:	learn: 0.3960222	total: 1m 13s	remaining: 2m 12s
537:	learn: 0.3959752	total: 1m 13s	remaining: 2m 11s
538:	learn: 0.3958990	total: 1m 13s	remaining: 2m 11s
539:	learn: 0.3958612	total: 1m 13s	remaining: 2m 11s
540:	learn: 0.3958258	total: 1m 14s	remaining: 2m 11s
541:	learn: 0.3957762	total: 1m 14s	remaining: 2m 11s
542:	learn: 0.3957090	total: 1m 14s	remaining: 2m 11s
543:	learn: 0.3956648	total: 1m 14s	remaining: 2m 10s
544:	learn: 0.3956079	total: 1m 14s	remaining: 2m 10s
545:	learn: 0.3955480	total: 1m 14s	remaining: 2m 10s
546:	learn: 0.3954897	total: 1m 14s	remaining: 2m 10s
547:	learn: 0.3954222	total: 1m 14s	remaining: 2m 10s
548:	learn: 0.3953872	total: 1m 15s	remaining: 2m 10s
549:	learn: 0.3953321	total: 1m 15s	remaining: 2m 9s
550:	learn: 0.3952689	total: 1m 15s	remaining: 2m 9s
551:	learn: 0.3952266	total: 1m 15s	remaining: 2m 9s
552:	learn: 0.3951814	total: 1m 15s	remaining: 2m 9s
553:	learn: 0.3951294	total: 1m 15s	remaining: 2m 9s
554:	learn: 0.3950664	total: 1m 15s	remaining: 2m 9s
555:	learn: 0.3950086	total: 1m 15s	remaining: 2m 8s
556:	learn: 0.3949377	total: 1m 16s	remaining: 2m 8s
557:	learn: 0.3948868	total: 1m 16s	remaining: 2m 8s
558:	learn: 0.3948224	total: 1m 16s	remaining: 2m 8s
559:	learn: 0.3947512	total: 1m 16s	remaining: 2m 8s
560:	learn: 0.3947281	total: 1m 16s	remaining: 2m 8s
561:	learn: 0.3946670	total: 1m 16s	remaining: 2m 8s
562:	learn: 0.3946170	total: 1m 16s	remaining: 2m 7s
563:	learn: 0.3945699	total: 1m 16s	remaining: 2m 7s
564:	learn: 0.3945318	total: 1m 17s	remaining: 2m 7s
565:	learn: 0.3944518	total: 1m 17s	remaining: 2m 7s
566:	learn: 0.3943603	total: 1m 17s	remaining: 2m 7s
567:	learn: 0.3943041	total: 1m 17s	remaining: 2m 7s
568:	learn: 0.3942488	total: 1m 17s	remaining: 2m 7s
569:	learn: 0.3941921	total: 1m 17s	remaining: 2m 6s
570:	learn: 0.3941315	total: 1m 17s	remaining: 2m 6s
571:	learn: 0.3940830	total: 1m 18s	remaining: 2m 6s
572:	learn: 0.3940367	total: 1m 18s	remaining: 2m 6s
573:	learn: 0.3939674	total: 1m 18s	remaining: 2m 6s
574:	learn: 0.3939087	total: 1m 18s	remaining: 2m 6s
575:	learn: 0.3938568	total: 1m 18s	remaining: 2m 6s
576:	learn: 0.3937798	total: 1m 18s	remaining: 2m 5s
577:	learn: 0.3937258	total: 1m 18s	remaining: 2m 5s
578:	learn: 0.3936605	total: 1m 19s	remaining: 2m 5s
579:	learn: 0.3935990	total: 1m 19s	remaining: 2m 5s
580:	learn: 0.3935406	total: 1m 19s	remaining: 2m 5s
581:	learn: 0.3934862	total: 1m 19s	remaining: 2m 5s
582:	learn: 0.3934287	total: 1m 19s	remaining: 2m 5s
583:	learn: 0.3933516	total: 1m 19s	remaining: 2m 4s
584:	learn: 0.3932932	total: 1m 19s	remaining: 2m 4s
585:	learn: 0.3932388	total: 1m 19s	remaining: 2m 4s
586:	learn: 0.3931713	total: 1m 20s	remaining: 2m 4s
587:	learn: 0.3931169	total: 1m 20s	remaining: 2m 4s
588:	learn: 0.3930562	total: 1m 20s	remaining: 2m 4s
589:	learn: 0.3929932	total: 1m 20s	remaining: 2m 4s
590:	learn: 0.3929141	total: 1m 20s	remaining: 2m 4s
591:	learn: 0.3928589	total: 1m 20s	remaining: 2m 4s
592:	learn: 0.3928216	total: 1m 21s	remaining: 2m 3s
593:	learn: 0.3927418	total: 1m 21s	remaining: 2m 3s
594:	learn: 0.3926848	total: 1m 21s	remaining: 2m 3s
595:	learn: 0.3926377	total: 1m 21s	remaining: 2m 3s
596:	learn: 0.3925676	total: 1m 21s	remaining: 2m 3s
597:	learn: 0.3925116	total: 1m 21s	remaining: 2m 3s
598:	learn: 0.3924573	total: 1m 22s	remaining: 2m 3s
599:	learn: 0.3924013	total: 1m 22s	remaining: 2m 3s
600:	learn: 0.3923509	total: 1m 22s	remaining: 2m 3s
601:	learn: 0.3922893	total: 1m 22s	remaining: 2m 3s
602:	learn: 0.3922412	total: 1m 22s	remaining: 2m 2s
603:	learn: 0.3921981	total: 1m 22s	remaining: 2m 2s
604:	learn: 0.3921433	total: 1m 22s	remaining: 2m 2s
605:	learn: 0.3920991	total: 1m 22s	remaining: 2m 2s
606:	learn: 0.3920410	total: 1m 23s	remaining: 2m 2s
607:	learn: 0.3919920	total: 1m 23s	remaining: 2m 2s
608:	learn: 0.3919395	total: 1m 23s	remaining: 2m 1s
609:	learn: 0.3918971	total: 1m 23s	remaining: 2m 1s
610:	learn: 0.3918389	total: 1m 23s	remaining: 2m 1s
611:	learn: 0.3917829	total: 1m 23s	remaining: 2m 1s
612:	learn: 0.3917298	total: 1m 23s	remaining: 2m 1s
613:	learn: 0.3916712	total: 1m 24s	remaining: 2m 1s
614:	learn: 0.3916217	total: 1m 24s	remaining: 2m 1s
615:	learn: 0.3915865	total: 1m 24s	remaining: 2m
616:	learn: 0.3915449	total: 1m 24s	remaining: 2m
617:	learn: 0.3914777	total: 1m 24s	remaining: 2m
618:	learn: 0.3914176	total: 1m 24s	remaining: 2m
619:	learn: 0.3913754	total: 1m 24s	remaining: 2m
620:	learn: 0.3913364	total: 1m 24s	remaining: 2m
621:	learn: 0.3912713	total: 1m 25s	remaining: 2m
622:	learn: 0.3912099	total: 1m 25s	remaining: 1m 59s
623:	learn: 0.3911808	total: 1m 25s	remaining: 1m 59s
624:	learn: 0.3911115	total: 1m 25s	remaining: 1m 59s
625:	learn: 0.3910676	total: 1m 25s	remaining: 1m 59s
626:	learn: 0.3910030	total: 1m 25s	remaining: 1m 59s
627:	learn: 0.3909758	total: 1m 25s	remaining: 1m 59s
628:	learn: 0.3909426	total: 1m 25s	remaining: 1m 59s
629:	learn: 0.3909034	total: 1m 26s	remaining: 1m 58s
630:	learn: 0.3908440	total: 1m 26s	remaining: 1m 58s
631:	learn: 0.3907892	total: 1m 26s	remaining: 1m 58s
632:	learn: 0.3907264	total: 1m 26s	remaining: 1m 58s
633:	learn: 0.3906719	total: 1m 26s	remaining: 1m 58s
634:	learn: 0.3906087	total: 1m 26s	remaining: 1m 58s
635:	learn: 0.3905623	total: 1m 26s	remaining: 1m 58s
636:	learn: 0.3905215	total: 1m 27s	remaining: 1m 57s
637:	learn: 0.3904772	total: 1m 27s	remaining: 1m 57s
638:	learn: 0.3904261	total: 1m 27s	remaining: 1m 57s
639:	learn: 0.3903706	total: 1m 27s	remaining: 1m 57s
640:	learn: 0.3903227	total: 1m 27s	remaining: 1m 57s
641:	learn: 0.3902540	total: 1m 27s	remaining: 1m 57s
642:	learn: 0.3901976	total: 1m 27s	remaining: 1m 56s
643:	learn: 0.3901513	total: 1m 27s	remaining: 1m 56s
644:	learn: 0.3900959	total: 1m 28s	remaining: 1m 56s
645:	learn: 0.3900383	total: 1m 28s	remaining: 1m 56s
646:	learn: 0.3899865	total: 1m 28s	remaining: 1m 56s
647:	learn: 0.3899645	total: 1m 28s	remaining: 1m 56s
648:	learn: 0.3899366	total: 1m 28s	remaining: 1m 56s
649:	learn: 0.3898787	total: 1m 28s	remaining: 1m 55s
650:	learn: 0.3898394	total: 1m 28s	remaining: 1m 55s
651:	learn: 0.3897709	total: 1m 28s	remaining: 1m 55s
652:	learn: 0.3897222	total: 1m 29s	remaining: 1m 55s
653:	learn: 0.3896661	total: 1m 29s	remaining: 1m 55s
654:	learn: 0.3896201	total: 1m 29s	remaining: 1m 55s
655:	learn: 0.3895869	total: 1m 29s	remaining: 1m 55s
656:	learn: 0.3895409	total: 1m 29s	remaining: 1m 54s
657:	learn: 0.3895067	total: 1m 29s	remaining: 1m 54s
658:	learn: 0.3894597	total: 1m 29s	remaining: 1m 54s
659:	learn: 0.3893939	total: 1m 29s	remaining: 1m 54s
660:	learn: 0.3893446	total: 1m 30s	remaining: 1m 54s
661:	learn: 0.3893076	total: 1m 30s	remaining: 1m 54s
662:	learn: 0.3892436	total: 1m 30s	remaining: 1m 54s
663:	learn: 0.3891902	total: 1m 30s	remaining: 1m 53s
664:	learn: 0.3891343	total: 1m 30s	remaining: 1m 53s
665:	learn: 0.3890642	total: 1m 30s	remaining: 1m 53s
666:	learn: 0.3890262	total: 1m 30s	remaining: 1m 53s
667:	learn: 0.3889695	total: 1m 30s	remaining: 1m 53s
668:	learn: 0.3889231	total: 1m 31s	remaining: 1m 53s
669:	learn: 0.3888694	total: 1m 31s	remaining: 1m 53s
670:	learn: 0.3888154	total: 1m 31s	remaining: 1m 52s
671:	learn: 0.3887629	total: 1m 31s	remaining: 1m 52s
672:	learn: 0.3887079	total: 1m 31s	remaining: 1m 52s
673:	learn: 0.3886541	total: 1m 31s	remaining: 1m 52s
674:	learn: 0.3885993	total: 1m 31s	remaining: 1m 52s
675:	learn: 0.3885519	total: 1m 32s	remaining: 1m 52s
676:	learn: 0.3885004	total: 1m 32s	remaining: 1m 52s
677:	learn: 0.3884413	total: 1m 32s	remaining: 1m 51s
678:	learn: 0.3883279	total: 1m 32s	remaining: 1m 51s
679:	learn: 0.3882908	total: 1m 32s	remaining: 1m 51s
680:	learn: 0.3882406	total: 1m 32s	remaining: 1m 51s
681:	learn: 0.3881992	total: 1m 32s	remaining: 1m 51s
682:	learn: 0.3881409	total: 1m 32s	remaining: 1m 51s
683:	learn: 0.3880967	total: 1m 33s	remaining: 1m 51s
684:	learn: 0.3880401	total: 1m 33s	remaining: 1m 50s
685:	learn: 0.3879746	total: 1m 33s	remaining: 1m 50s
686:	learn: 0.3879465	total: 1m 33s	remaining: 1m 50s
687:	learn: 0.3878961	total: 1m 33s	remaining: 1m 50s
688:	learn: 0.3878448	total: 1m 33s	remaining: 1m 50s
689:	learn: 0.3877892	total: 1m 33s	remaining: 1m 50s
690:	learn: 0.3877522	total: 1m 33s	remaining: 1m 50s
691:	learn: 0.3877041	total: 1m 34s	remaining: 1m 49s
692:	learn: 0.3876465	total: 1m 34s	remaining: 1m 49s
693:	learn: 0.3875941	total: 1m 34s	remaining: 1m 49s
694:	learn: 0.3875477	total: 1m 34s	remaining: 1m 49s
695:	learn: 0.3874880	total: 1m 34s	remaining: 1m 49s
696:	learn: 0.3874385	total: 1m 34s	remaining: 1m 49s
697:	learn: 0.3873833	total: 1m 34s	remaining: 1m 49s
698:	learn: 0.3873258	total: 1m 35s	remaining: 1m 48s
699:	learn: 0.3872725	total: 1m 35s	remaining: 1m 48s
700:	learn: 0.3872312	total: 1m 35s	remaining: 1m 48s
701:	learn: 0.3871687	total: 1m 35s	remaining: 1m 48s
702:	learn: 0.3871140	total: 1m 35s	remaining: 1m 48s
703:	learn: 0.3870602	total: 1m 35s	remaining: 1m 48s
704:	learn: 0.3870025	total: 1m 35s	remaining: 1m 48s
705:	learn: 0.3869656	total: 1m 36s	remaining: 1m 47s
706:	learn: 0.3869047	total: 1m 36s	remaining: 1m 47s
707:	learn: 0.3868277	total: 1m 36s	remaining: 1m 47s
708:	learn: 0.3867953	total: 1m 36s	remaining: 1m 47s
709:	learn: 0.3867461	total: 1m 36s	remaining: 1m 47s
710:	learn: 0.3866811	total: 1m 36s	remaining: 1m 47s
711:	learn: 0.3866542	total: 1m 36s	remaining: 1m 47s
712:	learn: 0.3865932	total: 1m 37s	remaining: 1m 47s
713:	learn: 0.3865451	total: 1m 37s	remaining: 1m 47s
714:	learn: 0.3865045	total: 1m 37s	remaining: 1m 46s
715:	learn: 0.3864432	total: 1m 37s	remaining: 1m 46s
716:	learn: 0.3863715	total: 1m 37s	remaining: 1m 46s
717:	learn: 0.3863487	total: 1m 37s	remaining: 1m 46s
718:	learn: 0.3862952	total: 1m 38s	remaining: 1m 46s
719:	learn: 0.3862361	total: 1m 38s	remaining: 1m 46s
720:	learn: 0.3861896	total: 1m 38s	remaining: 1m 46s
721:	learn: 0.3861336	total: 1m 38s	remaining: 1m 46s
722:	learn: 0.3860847	total: 1m 38s	remaining: 1m 46s
723:	learn: 0.3860448	total: 1m 38s	remaining: 1m 45s
724:	learn: 0.3859898	total: 1m 38s	remaining: 1m 45s
725:	learn: 0.3859334	total: 1m 39s	remaining: 1m 45s
726:	learn: 0.3858798	total: 1m 39s	remaining: 1m 45s
727:	learn: 0.3858475	total: 1m 39s	remaining: 1m 45s
728:	learn: 0.3857945	total: 1m 39s	remaining: 1m 45s
729:	learn: 0.3857465	total: 1m 39s	remaining: 1m 44s
730:	learn: 0.3856986	total: 1m 39s	remaining: 1m 44s
731:	learn: 0.3856405	total: 1m 39s	remaining: 1m 44s
732:	learn: 0.3855916	total: 1m 39s	remaining: 1m 44s
733:	learn: 0.3855284	total: 1m 40s	remaining: 1m 44s
734:	learn: 0.3854706	total: 1m 40s	remaining: 1m 44s
735:	learn: 0.3853980	total: 1m 40s	remaining: 1m 44s
736:	learn: 0.3853431	total: 1m 40s	remaining: 1m 44s
737:	learn: 0.3853121	total: 1m 40s	remaining: 1m 43s
738:	learn: 0.3852384	total: 1m 40s	remaining: 1m 43s
739:	learn: 0.3851875	total: 1m 40s	remaining: 1m 43s
740:	learn: 0.3851296	total: 1m 41s	remaining: 1m 43s
741:	learn: 0.3850833	total: 1m 41s	remaining: 1m 43s
742:	learn: 0.3850208	total: 1m 41s	remaining: 1m 43s
743:	learn: 0.3849662	total: 1m 41s	remaining: 1m 43s
744:	learn: 0.3849100	total: 1m 41s	remaining: 1m 43s
745:	learn: 0.3848718	total: 1m 41s	remaining: 1m 42s
746:	learn: 0.3848315	total: 1m 41s	remaining: 1m 42s
747:	learn: 0.3847829	total: 1m 42s	remaining: 1m 42s
748:	learn: 0.3847464	total: 1m 42s	remaining: 1m 42s
749:	learn: 0.3847059	total: 1m 42s	remaining: 1m 42s
750:	learn: 0.3846518	total: 1m 42s	remaining: 1m 42s
751:	learn: 0.3845902	total: 1m 42s	remaining: 1m 42s
752:	learn: 0.3845410	total: 1m 42s	remaining: 1m 41s
753:	learn: 0.3844534	total: 1m 42s	remaining: 1m 41s
754:	learn: 0.3844013	total: 1m 42s	remaining: 1m 41s
755:	learn: 0.3843575	total: 1m 43s	remaining: 1m 41s
756:	learn: 0.3842989	total: 1m 43s	remaining: 1m 41s
757:	learn: 0.3842533	total: 1m 43s	remaining: 1m 41s
758:	learn: 0.3841960	total: 1m 43s	remaining: 1m 41s
759:	learn: 0.3841485	total: 1m 43s	remaining: 1m 40s
760:	learn: 0.3841065	total: 1m 43s	remaining: 1m 40s
761:	learn: 0.3840530	total: 1m 43s	remaining: 1m 40s
762:	learn: 0.3840048	total: 1m 44s	remaining: 1m 40s
763:	learn: 0.3839531	total: 1m 44s	remaining: 1m 40s
764:	learn: 0.3839168	total: 1m 44s	remaining: 1m 40s
765:	learn: 0.3838816	total: 1m 44s	remaining: 1m 40s
766:	learn: 0.3838453	total: 1m 44s	remaining: 1m 39s
767:	learn: 0.3837915	total: 1m 44s	remaining: 1m 39s
768:	learn: 0.3837486	total: 1m 44s	remaining: 1m 39s
769:	learn: 0.3837099	total: 1m 44s	remaining: 1m 39s
770:	learn: 0.3836625	total: 1m 45s	remaining: 1m 39s
771:	learn: 0.3836266	total: 1m 45s	remaining: 1m 39s
772:	learn: 0.3835743	total: 1m 45s	remaining: 1m 39s
773:	learn: 0.3835333	total: 1m 45s	remaining: 1m 38s
774:	learn: 0.3834776	total: 1m 45s	remaining: 1m 38s
775:	learn: 0.3834241	total: 1m 45s	remaining: 1m 38s
776:	learn: 0.3833601	total: 1m 45s	remaining: 1m 38s
777:	learn: 0.3833083	total: 1m 46s	remaining: 1m 38s
778:	learn: 0.3832502	total: 1m 46s	remaining: 1m 38s
779:	learn: 0.3831859	total: 1m 46s	remaining: 1m 38s
780:	learn: 0.3831371	total: 1m 46s	remaining: 1m 38s
781:	learn: 0.3830982	total: 1m 46s	remaining: 1m 37s
782:	learn: 0.3830387	total: 1m 46s	remaining: 1m 37s
783:	learn: 0.3829861	total: 1m 46s	remaining: 1m 37s
784:	learn: 0.3829258	total: 1m 47s	remaining: 1m 37s
785:	learn: 0.3828731	total: 1m 47s	remaining: 1m 37s
786:	learn: 0.3828139	total: 1m 47s	remaining: 1m 37s
787:	learn: 0.3827589	total: 1m 47s	remaining: 1m 37s
788:	learn: 0.3827166	total: 1m 47s	remaining: 1m 36s
789:	learn: 0.3826700	total: 1m 47s	remaining: 1m 36s
790:	learn: 0.3826393	total: 1m 47s	remaining: 1m 36s
791:	learn: 0.3826010	total: 1m 47s	remaining: 1m 36s
792:	learn: 0.3825635	total: 1m 48s	remaining: 1m 36s
793:	learn: 0.3825097	total: 1m 48s	remaining: 1m 36s
794:	learn: 0.3824680	total: 1m 48s	remaining: 1m 36s
795:	learn: 0.3824261	total: 1m 48s	remaining: 1m 35s
796:	learn: 0.3823572	total: 1m 48s	remaining: 1m 35s
797:	learn: 0.3823105	total: 1m 48s	remaining: 1m 35s
798:	learn: 0.3822708	total: 1m 48s	remaining: 1m 35s
799:	learn: 0.3822176	total: 1m 49s	remaining: 1m 35s
800:	learn: 0.3821735	total: 1m 49s	remaining: 1m 35s
801:	learn: 0.3821191	total: 1m 49s	remaining: 1m 35s
802:	learn: 0.3820569	total: 1m 49s	remaining: 1m 34s
803:	learn: 0.3820152	total: 1m 49s	remaining: 1m 34s
804:	learn: 0.3819768	total: 1m 49s	remaining: 1m 34s
805:	learn: 0.3819278	total: 1m 49s	remaining: 1m 34s
806:	learn: 0.3818781	total: 1m 49s	remaining: 1m 34s
807:	learn: 0.3818278	total: 1m 50s	remaining: 1m 34s
808:	learn: 0.3817719	total: 1m 50s	remaining: 1m 34s
809:	learn: 0.3817172	total: 1m 50s	remaining: 1m 33s
810:	learn: 0.3816625	total: 1m 50s	remaining: 1m 33s
811:	learn: 0.3816084	total: 1m 50s	remaining: 1m 33s
812:	learn: 0.3815532	total: 1m 50s	remaining: 1m 33s
813:	learn: 0.3815007	total: 1m 50s	remaining: 1m 33s
814:	learn: 0.3814642	total: 1m 51s	remaining: 1m 33s
815:	learn: 0.3814251	total: 1m 51s	remaining: 1m 33s
816:	learn: 0.3813769	total: 1m 51s	remaining: 1m 33s
817:	learn: 0.3813351	total: 1m 51s	remaining: 1m 32s
818:	learn: 0.3813100	total: 1m 51s	remaining: 1m 32s
819:	learn: 0.3812602	total: 1m 51s	remaining: 1m 32s
820:	learn: 0.3812159	total: 1m 51s	remaining: 1m 32s
821:	learn: 0.3811881	total: 1m 51s	remaining: 1m 32s
822:	learn: 0.3811569	total: 1m 52s	remaining: 1m 32s
823:	learn: 0.3811261	total: 1m 52s	remaining: 1m 32s
824:	learn: 0.3810933	total: 1m 52s	remaining: 1m 31s
825:	learn: 0.3810440	total: 1m 52s	remaining: 1m 31s
826:	learn: 0.3810058	total: 1m 52s	remaining: 1m 31s
827:	learn: 0.3809535	total: 1m 52s	remaining: 1m 31s
828:	learn: 0.3809060	total: 1m 53s	remaining: 1m 31s
829:	learn: 0.3808510	total: 1m 53s	remaining: 1m 31s
830:	learn: 0.3807972	total: 1m 53s	remaining: 1m 31s
831:	learn: 0.3807545	total: 1m 53s	remaining: 1m 31s
832:	learn: 0.3807085	total: 1m 53s	remaining: 1m 31s
833:	learn: 0.3806723	total: 1m 53s	remaining: 1m 30s
834:	learn: 0.3806248	total: 1m 54s	remaining: 1m 30s
835:	learn: 0.3805660	total: 1m 54s	remaining: 1m 30s
836:	learn: 0.3805168	total: 1m 54s	remaining: 1m 30s
837:	learn: 0.3804715	total: 1m 54s	remaining: 1m 30s
838:	learn: 0.3804368	total: 1m 54s	remaining: 1m 30s
839:	learn: 0.3803882	total: 1m 54s	remaining: 1m 30s
840:	learn: 0.3803303	total: 1m 54s	remaining: 1m 30s
841:	learn: 0.3802810	total: 1m 55s	remaining: 1m 29s
842:	learn: 0.3802282	total: 1m 55s	remaining: 1m 29s
843:	learn: 0.3802054	total: 1m 55s	remaining: 1m 29s
844:	learn: 0.3801507	total: 1m 55s	remaining: 1m 29s
845:	learn: 0.3801140	total: 1m 55s	remaining: 1m 29s
846:	learn: 0.3800673	total: 1m 55s	remaining: 1m 29s
847:	learn: 0.3800163	total: 1m 55s	remaining: 1m 29s
848:	learn: 0.3799798	total: 1m 56s	remaining: 1m 29s
849:	learn: 0.3799404	total: 1m 56s	remaining: 1m 28s
850:	learn: 0.3798754	total: 1m 56s	remaining: 1m 28s
851:	learn: 0.3798319	total: 1m 56s	remaining: 1m 28s
852:	learn: 0.3797724	total: 1m 56s	remaining: 1m 28s
853:	learn: 0.3797327	total: 1m 56s	remaining: 1m 28s
854:	learn: 0.3796789	total: 1m 56s	remaining: 1m 28s
855:	learn: 0.3796216	total: 1m 57s	remaining: 1m 28s
856:	learn: 0.3795688	total: 1m 57s	remaining: 1m 27s
857:	learn: 0.3795222	total: 1m 57s	remaining: 1m 27s
858:	learn: 0.3794966	total: 1m 57s	remaining: 1m 27s
859:	learn: 0.3794460	total: 1m 57s	remaining: 1m 27s
860:	learn: 0.3794094	total: 1m 57s	remaining: 1m 27s
861:	learn: 0.3793763	total: 1m 57s	remaining: 1m 27s
862:	learn: 0.3793228	total: 1m 57s	remaining: 1m 27s
863:	learn: 0.3792845	total: 1m 58s	remaining: 1m 26s
864:	learn: 0.3792309	total: 1m 58s	remaining: 1m 26s
865:	learn: 0.3791741	total: 1m 58s	remaining: 1m 26s
866:	learn: 0.3791377	total: 1m 58s	remaining: 1m 26s
867:	learn: 0.3790780	total: 1m 58s	remaining: 1m 26s
868:	learn: 0.3790399	total: 1m 58s	remaining: 1m 26s
869:	learn: 0.3790013	total: 1m 58s	remaining: 1m 26s
870:	learn: 0.3789610	total: 1m 59s	remaining: 1m 26s
871:	learn: 0.3789129	total: 1m 59s	remaining: 1m 25s
872:	learn: 0.3788803	total: 1m 59s	remaining: 1m 25s
873:	learn: 0.3788454	total: 1m 59s	remaining: 1m 25s
874:	learn: 0.3788096	total: 1m 59s	remaining: 1m 25s
875:	learn: 0.3787550	total: 1m 59s	remaining: 1m 25s
876:	learn: 0.3787143	total: 1m 59s	remaining: 1m 25s
877:	learn: 0.3786600	total: 2m	remaining: 1m 25s
878:	learn: 0.3786222	total: 2m	remaining: 1m 24s
879:	learn: 0.3785857	total: 2m	remaining: 1m 24s
880:	learn: 0.3785489	total: 2m	remaining: 1m 24s
881:	learn: 0.3784970	total: 2m	remaining: 1m 24s
882:	learn: 0.3784593	total: 2m	remaining: 1m 24s
883:	learn: 0.3783997	total: 2m	remaining: 1m 24s
884:	learn: 0.3783521	total: 2m	remaining: 1m 24s
885:	learn: 0.3783041	total: 2m 1s	remaining: 1m 23s
886:	learn: 0.3782589	total: 2m 1s	remaining: 1m 23s
887:	learn: 0.3782111	total: 2m 1s	remaining: 1m 23s
888:	learn: 0.3781636	total: 2m 1s	remaining: 1m 23s
889:	learn: 0.3781002	total: 2m 1s	remaining: 1m 23s
890:	learn: 0.3780578	total: 2m 1s	remaining: 1m 23s
891:	learn: 0.3780034	total: 2m 1s	remaining: 1m 23s
892:	learn: 0.3779708	total: 2m 2s	remaining: 1m 22s
893:	learn: 0.3779122	total: 2m 2s	remaining: 1m 22s
894:	learn: 0.3778562	total: 2m 2s	remaining: 1m 22s
895:	learn: 0.3778086	total: 2m 2s	remaining: 1m 22s
896:	learn: 0.3777577	total: 2m 2s	remaining: 1m 22s
897:	learn: 0.3777094	total: 2m 2s	remaining: 1m 22s
898:	learn: 0.3776546	total: 2m 2s	remaining: 1m 22s
899:	learn: 0.3776142	total: 2m 3s	remaining: 1m 22s
900:	learn: 0.3775840	total: 2m 3s	remaining: 1m 21s
901:	learn: 0.3775436	total: 2m 3s	remaining: 1m 21s
902:	learn: 0.3774964	total: 2m 3s	remaining: 1m 21s
903:	learn: 0.3774419	total: 2m 3s	remaining: 1m 21s
904:	learn: 0.3773950	total: 2m 3s	remaining: 1m 21s
905:	learn: 0.3773374	total: 2m 3s	remaining: 1m 21s
906:	learn: 0.3772908	total: 2m 4s	remaining: 1m 21s
907:	learn: 0.3772455	total: 2m 4s	remaining: 1m 20s
908:	learn: 0.3771963	total: 2m 4s	remaining: 1m 20s
909:	learn: 0.3771502	total: 2m 4s	remaining: 1m 20s
910:	learn: 0.3771054	total: 2m 4s	remaining: 1m 20s
911:	learn: 0.3770648	total: 2m 4s	remaining: 1m 20s
912:	learn: 0.3770302	total: 2m 4s	remaining: 1m 20s
913:	learn: 0.3769854	total: 2m 4s	remaining: 1m 20s
914:	learn: 0.3769472	total: 2m 5s	remaining: 1m 19s
915:	learn: 0.3769032	total: 2m 5s	remaining: 1m 19s
916:	learn: 0.3768501	total: 2m 5s	remaining: 1m 19s
917:	learn: 0.3768138	total: 2m 5s	remaining: 1m 19s
918:	learn: 0.3767561	total: 2m 5s	remaining: 1m 19s
919:	learn: 0.3767138	total: 2m 5s	remaining: 1m 19s
920:	learn: 0.3766646	total: 2m 5s	remaining: 1m 19s
921:	learn: 0.3766409	total: 2m 6s	remaining: 1m 19s
922:	learn: 0.3766066	total: 2m 6s	remaining: 1m 18s
923:	learn: 0.3765623	total: 2m 6s	remaining: 1m 18s
924:	learn: 0.3765152	total: 2m 6s	remaining: 1m 18s
925:	learn: 0.3764705	total: 2m 6s	remaining: 1m 18s
926:	learn: 0.3764187	total: 2m 6s	remaining: 1m 18s
927:	learn: 0.3763690	total: 2m 6s	remaining: 1m 18s
928:	learn: 0.3763167	total: 2m 6s	remaining: 1m 18s
929:	learn: 0.3762654	total: 2m 7s	remaining: 1m 17s
930:	learn: 0.3762146	total: 2m 7s	remaining: 1m 17s
931:	learn: 0.3761636	total: 2m 7s	remaining: 1m 17s
932:	learn: 0.3761123	total: 2m 7s	remaining: 1m 17s
933:	learn: 0.3760586	total: 2m 7s	remaining: 1m 17s
934:	learn: 0.3760211	total: 2m 7s	remaining: 1m 17s
935:	learn: 0.3759685	total: 2m 8s	remaining: 1m 17s
936:	learn: 0.3759208	total: 2m 8s	remaining: 1m 17s
937:	learn: 0.3758772	total: 2m 8s	remaining: 1m 16s
938:	learn: 0.3758528	total: 2m 8s	remaining: 1m 16s
939:	learn: 0.3758030	total: 2m 8s	remaining: 1m 16s
940:	learn: 0.3757606	total: 2m 8s	remaining: 1m 16s
941:	learn: 0.3757173	total: 2m 8s	remaining: 1m 16s
942:	learn: 0.3756847	total: 2m 9s	remaining: 1m 16s
943:	learn: 0.3756504	total: 2m 9s	remaining: 1m 16s
944:	learn: 0.3756086	total: 2m 9s	remaining: 1m 16s
945:	learn: 0.3755667	total: 2m 9s	remaining: 1m 15s
946:	learn: 0.3755237	total: 2m 9s	remaining: 1m 15s
947:	learn: 0.3754864	total: 2m 9s	remaining: 1m 15s
948:	learn: 0.3754533	total: 2m 10s	remaining: 1m 15s
949:	learn: 0.3754182	total: 2m 10s	remaining: 1m 15s
950:	learn: 0.3753853	total: 2m 10s	remaining: 1m 15s
951:	learn: 0.3753374	total: 2m 10s	remaining: 1m 15s
952:	learn: 0.3753049	total: 2m 10s	remaining: 1m 14s
953:	learn: 0.3752777	total: 2m 10s	remaining: 1m 14s
954:	learn: 0.3752370	total: 2m 10s	remaining: 1m 14s
955:	learn: 0.3751915	total: 2m 11s	remaining: 1m 14s
956:	learn: 0.3751478	total: 2m 11s	remaining: 1m 14s
957:	learn: 0.3751096	total: 2m 11s	remaining: 1m 14s
958:	learn: 0.3750560	total: 2m 11s	remaining: 1m 14s
959:	learn: 0.3750012	total: 2m 11s	remaining: 1m 14s
960:	learn: 0.3749600	total: 2m 11s	remaining: 1m 13s
961:	learn: 0.3749164	total: 2m 11s	remaining: 1m 13s
962:	learn: 0.3748591	total: 2m 12s	remaining: 1m 13s
963:	learn: 0.3748131	total: 2m 12s	remaining: 1m 13s
964:	learn: 0.3747663	total: 2m 12s	remaining: 1m 13s
965:	learn: 0.3747172	total: 2m 12s	remaining: 1m 13s
966:	learn: 0.3746493	total: 2m 12s	remaining: 1m 13s
967:	learn: 0.3746033	total: 2m 12s	remaining: 1m 12s
968:	learn: 0.3745561	total: 2m 12s	remaining: 1m 12s
969:	learn: 0.3744995	total: 2m 13s	remaining: 1m 12s
970:	learn: 0.3744633	total: 2m 13s	remaining: 1m 12s
971:	learn: 0.3744225	total: 2m 13s	remaining: 1m 12s
972:	learn: 0.3743637	total: 2m 13s	remaining: 1m 12s
973:	learn: 0.3743302	total: 2m 13s	remaining: 1m 12s
974:	learn: 0.3742857	total: 2m 13s	remaining: 1m 11s
975:	learn: 0.3742360	total: 2m 13s	remaining: 1m 11s
976:	learn: 0.3741986	total: 2m 13s	remaining: 1m 11s
977:	learn: 0.3741513	total: 2m 14s	remaining: 1m 11s
978:	learn: 0.3741077	total: 2m 14s	remaining: 1m 11s
979:	learn: 0.3740698	total: 2m 14s	remaining: 1m 11s
980:	learn: 0.3740295	total: 2m 14s	remaining: 1m 11s
981:	learn: 0.3739837	total: 2m 14s	remaining: 1m 11s
982:	learn: 0.3739430	total: 2m 14s	remaining: 1m 10s
983:	learn: 0.3739027	total: 2m 14s	remaining: 1m 10s
984:	learn: 0.3738527	total: 2m 15s	remaining: 1m 10s
985:	learn: 0.3738103	total: 2m 15s	remaining: 1m 10s
986:	learn: 0.3737626	total: 2m 15s	remaining: 1m 10s
987:	learn: 0.3737188	total: 2m 15s	remaining: 1m 10s
988:	learn: 0.3736660	total: 2m 15s	remaining: 1m 10s
989:	learn: 0.3736331	total: 2m 15s	remaining: 1m 9s
990:	learn: 0.3735849	total: 2m 15s	remaining: 1m 9s
991:	learn: 0.3735359	total: 2m 16s	remaining: 1m 9s
992:	learn: 0.3735022	total: 2m 16s	remaining: 1m 9s
993:	learn: 0.3734569	total: 2m 16s	remaining: 1m 9s
994:	learn: 0.3734196	total: 2m 16s	remaining: 1m 9s
995:	learn: 0.3733780	total: 2m 16s	remaining: 1m 9s
996:	learn: 0.3733265	total: 2m 16s	remaining: 1m 8s
997:	learn: 0.3732809	total: 2m 16s	remaining: 1m 8s
998:	learn: 0.3732357	total: 2m 17s	remaining: 1m 8s
999:	learn: 0.3731870	total: 2m 17s	remaining: 1m 8s
1000:	learn: 0.3731532	total: 2m 17s	remaining: 1m 8s
1001:	learn: 0.3731024	total: 2m 17s	remaining: 1m 8s
1002:	learn: 0.3730537	total: 2m 17s	remaining: 1m 8s
1003:	learn: 0.3730069	total: 2m 17s	remaining: 1m 8s
1004:	learn: 0.3729617	total: 2m 17s	remaining: 1m 7s
1005:	learn: 0.3729182	total: 2m 18s	remaining: 1m 7s
1006:	learn: 0.3728847	total: 2m 18s	remaining: 1m 7s
1007:	learn: 0.3728410	total: 2m 18s	remaining: 1m 7s
1008:	learn: 0.3727980	total: 2m 18s	remaining: 1m 7s
1009:	learn: 0.3727398	total: 2m 18s	remaining: 1m 7s
1010:	learn: 0.3727057	total: 2m 18s	remaining: 1m 7s
1011:	learn: 0.3726560	total: 2m 18s	remaining: 1m 6s
1012:	learn: 0.3726027	total: 2m 18s	remaining: 1m 6s
1013:	learn: 0.3725541	total: 2m 19s	remaining: 1m 6s
1014:	learn: 0.3725154	total: 2m 19s	remaining: 1m 6s
1015:	learn: 0.3724672	total: 2m 19s	remaining: 1m 6s
1016:	learn: 0.3724221	total: 2m 19s	remaining: 1m 6s
1017:	learn: 0.3723828	total: 2m 19s	remaining: 1m 6s
1018:	learn: 0.3723364	total: 2m 20s	remaining: 1m 6s
1019:	learn: 0.3722822	total: 2m 20s	remaining: 1m 6s
1020:	learn: 0.3722390	total: 2m 20s	remaining: 1m 5s
1021:	learn: 0.3722017	total: 2m 20s	remaining: 1m 5s
1022:	learn: 0.3721676	total: 2m 20s	remaining: 1m 5s
1023:	learn: 0.3721207	total: 2m 20s	remaining: 1m 5s
1024:	learn: 0.3720736	total: 2m 21s	remaining: 1m 5s
1025:	learn: 0.3720269	total: 2m 21s	remaining: 1m 5s
1026:	learn: 0.3719960	total: 2m 21s	remaining: 1m 5s
1027:	learn: 0.3719481	total: 2m 21s	remaining: 1m 5s
1028:	learn: 0.3718986	total: 2m 22s	remaining: 1m 5s
1029:	learn: 0.3718587	total: 2m 22s	remaining: 1m 4s
1030:	learn: 0.3718121	total: 2m 22s	remaining: 1m 4s
1031:	learn: 0.3717654	total: 2m 22s	remaining: 1m 4s
1032:	learn: 0.3717141	total: 2m 22s	remaining: 1m 4s
1033:	learn: 0.3716635	total: 2m 22s	remaining: 1m 4s
1034:	learn: 0.3716283	total: 2m 23s	remaining: 1m 4s
1035:	learn: 0.3715754	total: 2m 23s	remaining: 1m 4s
1036:	learn: 0.3715295	total: 2m 23s	remaining: 1m 4s
1037:	learn: 0.3714962	total: 2m 23s	remaining: 1m 4s
1038:	learn: 0.3714447	total: 2m 24s	remaining: 1m 3s
1039:	learn: 0.3714009	total: 2m 24s	remaining: 1m 3s
1040:	learn: 0.3713633	total: 2m 24s	remaining: 1m 3s
1041:	learn: 0.3713314	total: 2m 24s	remaining: 1m 3s
1042:	learn: 0.3712981	total: 2m 24s	remaining: 1m 3s
1043:	learn: 0.3712492	total: 2m 24s	remaining: 1m 3s
1044:	learn: 0.3711985	total: 2m 25s	remaining: 1m 3s
1045:	learn: 0.3711606	total: 2m 25s	remaining: 1m 3s
1046:	learn: 0.3711225	total: 2m 25s	remaining: 1m 2s
1047:	learn: 0.3710685	total: 2m 25s	remaining: 1m 2s
1048:	learn: 0.3710288	total: 2m 25s	remaining: 1m 2s
1049:	learn: 0.3710008	total: 2m 25s	remaining: 1m 2s
1050:	learn: 0.3709592	total: 2m 26s	remaining: 1m 2s
1051:	learn: 0.3709122	total: 2m 26s	remaining: 1m 2s
1052:	learn: 0.3708691	total: 2m 26s	remaining: 1m 2s
1053:	learn: 0.3708276	total: 2m 26s	remaining: 1m 1s
1054:	learn: 0.3707956	total: 2m 26s	remaining: 1m 1s
1055:	learn: 0.3707497	total: 2m 26s	remaining: 1m 1s
1056:	learn: 0.3707017	total: 2m 26s	remaining: 1m 1s
1057:	learn: 0.3706482	total: 2m 27s	remaining: 1m 1s
1058:	learn: 0.3705851	total: 2m 27s	remaining: 1m 1s
1059:	learn: 0.3705349	total: 2m 27s	remaining: 1m 1s
1060:	learn: 0.3704965	total: 2m 27s	remaining: 1m 1s
1061:	learn: 0.3704542	total: 2m 27s	remaining: 1m
1062:	learn: 0.3704247	total: 2m 27s	remaining: 1m
1063:	learn: 0.3703825	total: 2m 28s	remaining: 1m
1064:	learn: 0.3703453	total: 2m 28s	remaining: 1m
1065:	learn: 0.3703053	total: 2m 28s	remaining: 1m
1066:	learn: 0.3702654	total: 2m 28s	remaining: 1m
1067:	learn: 0.3702279	total: 2m 28s	remaining: 1m
1068:	learn: 0.3701741	total: 2m 28s	remaining: 1m
1069:	learn: 0.3701301	total: 2m 28s	remaining: 59.9s
1070:	learn: 0.3700828	total: 2m 29s	remaining: 59.7s
1071:	learn: 0.3700362	total: 2m 29s	remaining: 59.6s
1072:	learn: 0.3699843	total: 2m 29s	remaining: 59.5s
1073:	learn: 0.3699305	total: 2m 29s	remaining: 59.3s
1074:	learn: 0.3698770	total: 2m 29s	remaining: 59.2s
1075:	learn: 0.3698392	total: 2m 29s	remaining: 59.1s
1076:	learn: 0.3697939	total: 2m 30s	remaining: 58.9s
1077:	learn: 0.3697504	total: 2m 30s	remaining: 58.8s
1078:	learn: 0.3697163	total: 2m 30s	remaining: 58.7s
1079:	learn: 0.3696845	total: 2m 30s	remaining: 58.5s
1080:	learn: 0.3696316	total: 2m 30s	remaining: 58.4s
1081:	learn: 0.3695958	total: 2m 30s	remaining: 58.2s
1082:	learn: 0.3695466	total: 2m 30s	remaining: 58.1s
1083:	learn: 0.3695103	total: 2m 30s	remaining: 57.9s
1084:	learn: 0.3694797	total: 2m 31s	remaining: 57.8s
1085:	learn: 0.3694576	total: 2m 31s	remaining: 57.7s
1086:	learn: 0.3694196	total: 2m 31s	remaining: 57.5s
1087:	learn: 0.3693764	total: 2m 31s	remaining: 57.4s
1088:	learn: 0.3693322	total: 2m 31s	remaining: 57.2s
1089:	learn: 0.3692897	total: 2m 31s	remaining: 57.1s
1090:	learn: 0.3692506	total: 2m 31s	remaining: 56.9s
1091:	learn: 0.3692196	total: 2m 32s	remaining: 56.8s
1092:	learn: 0.3691673	total: 2m 32s	remaining: 56.7s
1093:	learn: 0.3691178	total: 2m 32s	remaining: 56.5s
1094:	learn: 0.3690699	total: 2m 32s	remaining: 56.4s
1095:	learn: 0.3690358	total: 2m 32s	remaining: 56.2s
1096:	learn: 0.3689895	total: 2m 32s	remaining: 56.1s
1097:	learn: 0.3689566	total: 2m 32s	remaining: 55.9s
1098:	learn: 0.3689186	total: 2m 32s	remaining: 55.8s
1099:	learn: 0.3688814	total: 2m 33s	remaining: 55.7s
1100:	learn: 0.3688498	total: 2m 33s	remaining: 55.5s
1101:	learn: 0.3688058	total: 2m 33s	remaining: 55.4s
1102:	learn: 0.3687674	total: 2m 33s	remaining: 55.2s
1103:	learn: 0.3687186	total: 2m 33s	remaining: 55.1s
1104:	learn: 0.3686696	total: 2m 33s	remaining: 54.9s
1105:	learn: 0.3686309	total: 2m 33s	remaining: 54.8s
1106:	learn: 0.3686020	total: 2m 33s	remaining: 54.7s
1107:	learn: 0.3685492	total: 2m 34s	remaining: 54.5s
1108:	learn: 0.3684965	total: 2m 34s	remaining: 54.4s
1109:	learn: 0.3684466	total: 2m 34s	remaining: 54.2s
1110:	learn: 0.3684184	total: 2m 34s	remaining: 54.1s
1111:	learn: 0.3683655	total: 2m 34s	remaining: 53.9s
1112:	learn: 0.3683193	total: 2m 34s	remaining: 53.8s
1113:	learn: 0.3682814	total: 2m 34s	remaining: 53.7s
1114:	learn: 0.3682191	total: 2m 35s	remaining: 53.5s
1115:	learn: 0.3681796	total: 2m 35s	remaining: 53.4s
1116:	learn: 0.3681304	total: 2m 35s	remaining: 53.3s
1117:	learn: 0.3680822	total: 2m 35s	remaining: 53.1s
1118:	learn: 0.3680333	total: 2m 35s	remaining: 53s
1119:	learn: 0.3679971	total: 2m 35s	remaining: 52.8s
1120:	learn: 0.3679579	total: 2m 35s	remaining: 52.7s
1121:	learn: 0.3679136	total: 2m 35s	remaining: 52.5s
1122:	learn: 0.3678779	total: 2m 36s	remaining: 52.4s
1123:	learn: 0.3678357	total: 2m 36s	remaining: 52.3s
1124:	learn: 0.3677860	total: 2m 36s	remaining: 52.1s
1125:	learn: 0.3677438	total: 2m 36s	remaining: 52s
1126:	learn: 0.3676916	total: 2m 36s	remaining: 51.8s
1127:	learn: 0.3676449	total: 2m 36s	remaining: 51.7s
1128:	learn: 0.3676125	total: 2m 36s	remaining: 51.6s
1129:	learn: 0.3675647	total: 2m 37s	remaining: 51.4s
1130:	learn: 0.3675373	total: 2m 37s	remaining: 51.3s
1131:	learn: 0.3674929	total: 2m 37s	remaining: 51.1s
1132:	learn: 0.3674296	total: 2m 37s	remaining: 51s
1133:	learn: 0.3673933	total: 2m 37s	remaining: 50.9s
1134:	learn: 0.3673771	total: 2m 37s	remaining: 50.7s
1135:	learn: 0.3673272	total: 2m 37s	remaining: 50.6s
1136:	learn: 0.3672860	total: 2m 37s	remaining: 50.4s
1137:	learn: 0.3672551	total: 2m 38s	remaining: 50.3s
1138:	learn: 0.3672176	total: 2m 38s	remaining: 50.1s
1139:	learn: 0.3671717	total: 2m 38s	remaining: 50s
1140:	learn: 0.3671254	total: 2m 38s	remaining: 49.9s
1141:	learn: 0.3670798	total: 2m 38s	remaining: 49.7s
1142:	learn: 0.3670304	total: 2m 38s	remaining: 49.6s
1143:	learn: 0.3669864	total: 2m 38s	remaining: 49.4s
1144:	learn: 0.3669403	total: 2m 39s	remaining: 49.3s
1145:	learn: 0.3668919	total: 2m 39s	remaining: 49.2s
1146:	learn: 0.3668476	total: 2m 39s	remaining: 49s
1147:	learn: 0.3668027	total: 2m 39s	remaining: 48.9s
1148:	learn: 0.3667687	total: 2m 39s	remaining: 48.8s
1149:	learn: 0.3667167	total: 2m 39s	remaining: 48.6s
1150:	learn: 0.3666707	total: 2m 39s	remaining: 48.5s
1151:	learn: 0.3666246	total: 2m 40s	remaining: 48.4s
1152:	learn: 0.3665840	total: 2m 40s	remaining: 48.2s
1153:	learn: 0.3665396	total: 2m 40s	remaining: 48.1s
1154:	learn: 0.3664852	total: 2m 40s	remaining: 48s
1155:	learn: 0.3664366	total: 2m 40s	remaining: 47.8s
1156:	learn: 0.3663933	total: 2m 40s	remaining: 47.7s
1157:	learn: 0.3663545	total: 2m 41s	remaining: 47.6s
1158:	learn: 0.3663138	total: 2m 41s	remaining: 47.4s
1159:	learn: 0.3662786	total: 2m 41s	remaining: 47.3s
1160:	learn: 0.3662776	total: 2m 41s	remaining: 47.1s
1161:	learn: 0.3662474	total: 2m 41s	remaining: 47s
1162:	learn: 0.3662196	total: 2m 41s	remaining: 46.9s
1163:	learn: 0.3661912	total: 2m 41s	remaining: 46.7s
1164:	learn: 0.3661429	total: 2m 41s	remaining: 46.6s
1165:	learn: 0.3660898	total: 2m 42s	remaining: 46.4s
1166:	learn: 0.3660438	total: 2m 42s	remaining: 46.3s
1167:	learn: 0.3660197	total: 2m 42s	remaining: 46.2s
1168:	learn: 0.3659712	total: 2m 42s	remaining: 46s
1169:	learn: 0.3659173	total: 2m 42s	remaining: 45.9s
1170:	learn: 0.3658666	total: 2m 42s	remaining: 45.7s
1171:	learn: 0.3658217	total: 2m 42s	remaining: 45.6s
1172:	learn: 0.3657921	total: 2m 43s	remaining: 45.4s
1173:	learn: 0.3657510	total: 2m 43s	remaining: 45.3s
1174:	learn: 0.3657171	total: 2m 43s	remaining: 45.2s
1175:	learn: 0.3656946	total: 2m 43s	remaining: 45s
1176:	learn: 0.3656551	total: 2m 43s	remaining: 44.9s
1177:	learn: 0.3656183	total: 2m 43s	remaining: 44.7s
1178:	learn: 0.3655736	total: 2m 43s	remaining: 44.6s
1179:	learn: 0.3655277	total: 2m 43s	remaining: 44.5s
1180:	learn: 0.3655062	total: 2m 44s	remaining: 44.3s
1181:	learn: 0.3654685	total: 2m 44s	remaining: 44.2s
1182:	learn: 0.3654164	total: 2m 44s	remaining: 44s
1183:	learn: 0.3653751	total: 2m 44s	remaining: 43.9s
1184:	learn: 0.3653186	total: 2m 44s	remaining: 43.8s
1185:	learn: 0.3652829	total: 2m 44s	remaining: 43.6s
1186:	learn: 0.3652277	total: 2m 44s	remaining: 43.5s
1187:	learn: 0.3651782	total: 2m 45s	remaining: 43.3s
1188:	learn: 0.3651386	total: 2m 45s	remaining: 43.2s
1189:	learn: 0.3650898	total: 2m 45s	remaining: 43.1s
1190:	learn: 0.3650374	total: 2m 45s	remaining: 42.9s
1191:	learn: 0.3650052	total: 2m 45s	remaining: 42.8s
1192:	learn: 0.3649662	total: 2m 45s	remaining: 42.6s
1193:	learn: 0.3649227	total: 2m 45s	remaining: 42.5s
1194:	learn: 0.3648786	total: 2m 45s	remaining: 42.3s
1195:	learn: 0.3648407	total: 2m 46s	remaining: 42.2s
1196:	learn: 0.3648092	total: 2m 46s	remaining: 42.1s
1197:	learn: 0.3647742	total: 2m 46s	remaining: 41.9s
1198:	learn: 0.3647233	total: 2m 46s	remaining: 41.8s
1199:	learn: 0.3646895	total: 2m 46s	remaining: 41.6s
1200:	learn: 0.3646454	total: 2m 46s	remaining: 41.5s
1201:	learn: 0.3646005	total: 2m 46s	remaining: 41.4s
1202:	learn: 0.3645567	total: 2m 46s	remaining: 41.2s
1203:	learn: 0.3645041	total: 2m 47s	remaining: 41.1s
1204:	learn: 0.3644610	total: 2m 47s	remaining: 40.9s
1205:	learn: 0.3644356	total: 2m 47s	remaining: 40.8s
1206:	learn: 0.3643912	total: 2m 47s	remaining: 40.7s
1207:	learn: 0.3643478	total: 2m 47s	remaining: 40.5s
1208:	learn: 0.3643091	total: 2m 47s	remaining: 40.4s
1209:	learn: 0.3642447	total: 2m 47s	remaining: 40.2s
1210:	learn: 0.3642153	total: 2m 47s	remaining: 40.1s
1211:	learn: 0.3641765	total: 2m 48s	remaining: 39.9s
1212:	learn: 0.3641294	total: 2m 48s	remaining: 39.8s
1213:	learn: 0.3641051	total: 2m 48s	remaining: 39.7s
1214:	learn: 0.3640641	total: 2m 48s	remaining: 39.5s
1215:	learn: 0.3640139	total: 2m 48s	remaining: 39.4s
1216:	learn: 0.3639693	total: 2m 48s	remaining: 39.2s
1217:	learn: 0.3639383	total: 2m 48s	remaining: 39.1s
1218:	learn: 0.3638912	total: 2m 48s	remaining: 38.9s
1219:	learn: 0.3638597	total: 2m 49s	remaining: 38.8s
1220:	learn: 0.3638086	total: 2m 49s	remaining: 38.7s
1221:	learn: 0.3637619	total: 2m 49s	remaining: 38.5s
1222:	learn: 0.3637110	total: 2m 49s	remaining: 38.4s
1223:	learn: 0.3636685	total: 2m 49s	remaining: 38.2s
1224:	learn: 0.3636313	total: 2m 49s	remaining: 38.1s
1225:	learn: 0.3635932	total: 2m 49s	remaining: 38s
1226:	learn: 0.3635538	total: 2m 49s	remaining: 37.8s
1227:	learn: 0.3635177	total: 2m 50s	remaining: 37.7s
1228:	learn: 0.3634689	total: 2m 50s	remaining: 37.5s
1229:	learn: 0.3634337	total: 2m 50s	remaining: 37.4s
1230:	learn: 0.3633810	total: 2m 50s	remaining: 37.2s
1231:	learn: 0.3633530	total: 2m 50s	remaining: 37.1s
1232:	learn: 0.3633112	total: 2m 50s	remaining: 37s
1233:	learn: 0.3632709	total: 2m 50s	remaining: 36.8s
1234:	learn: 0.3632452	total: 2m 50s	remaining: 36.7s
1235:	learn: 0.3631949	total: 2m 51s	remaining: 36.5s
1236:	learn: 0.3631493	total: 2m 51s	remaining: 36.4s
1237:	learn: 0.3631131	total: 2m 51s	remaining: 36.3s
1238:	learn: 0.3630601	total: 2m 51s	remaining: 36.1s
1239:	learn: 0.3630121	total: 2m 51s	remaining: 36s
1240:	learn: 0.3629830	total: 2m 51s	remaining: 35.8s
1241:	learn: 0.3629533	total: 2m 51s	remaining: 35.7s
1242:	learn: 0.3629059	total: 2m 51s	remaining: 35.5s
1243:	learn: 0.3628630	total: 2m 52s	remaining: 35.4s
1244:	learn: 0.3628157	total: 2m 52s	remaining: 35.3s
1245:	learn: 0.3627714	total: 2m 52s	remaining: 35.1s
1246:	learn: 0.3627310	total: 2m 52s	remaining: 35s
1247:	learn: 0.3627039	total: 2m 52s	remaining: 34.8s
1248:	learn: 0.3626621	total: 2m 52s	remaining: 34.7s
1249:	learn: 0.3626294	total: 2m 52s	remaining: 34.5s
1250:	learn: 0.3625935	total: 2m 52s	remaining: 34.4s
1251:	learn: 0.3625416	total: 2m 52s	remaining: 34.3s
1252:	learn: 0.3624952	total: 2m 53s	remaining: 34.1s
1253:	learn: 0.3624737	total: 2m 53s	remaining: 34s
1254:	learn: 0.3624363	total: 2m 53s	remaining: 33.8s
1255:	learn: 0.3624073	total: 2m 53s	remaining: 33.7s
1256:	learn: 0.3623577	total: 2m 53s	remaining: 33.6s
1257:	learn: 0.3623122	total: 2m 53s	remaining: 33.4s
1258:	learn: 0.3622688	total: 2m 53s	remaining: 33.3s
1259:	learn: 0.3622408	total: 2m 53s	remaining: 33.1s
1260:	learn: 0.3622213	total: 2m 54s	remaining: 33s
1261:	learn: 0.3621759	total: 2m 54s	remaining: 32.9s
1262:	learn: 0.3621355	total: 2m 54s	remaining: 32.7s
1263:	learn: 0.3620953	total: 2m 54s	remaining: 32.6s
1264:	learn: 0.3620594	total: 2m 54s	remaining: 32.4s
1265:	learn: 0.3620185	total: 2m 54s	remaining: 32.3s
1266:	learn: 0.3619849	total: 2m 54s	remaining: 32.1s
1267:	learn: 0.3619460	total: 2m 54s	remaining: 32s
1268:	learn: 0.3619105	total: 2m 55s	remaining: 31.9s
1269:	learn: 0.3618746	total: 2m 55s	remaining: 31.7s
1270:	learn: 0.3618315	total: 2m 55s	remaining: 31.6s
1271:	learn: 0.3617918	total: 2m 55s	remaining: 31.5s
1272:	learn: 0.3617484	total: 2m 55s	remaining: 31.3s
1273:	learn: 0.3617098	total: 2m 55s	remaining: 31.2s
1274:	learn: 0.3616750	total: 2m 55s	remaining: 31s
1275:	learn: 0.3616322	total: 2m 56s	remaining: 30.9s
1276:	learn: 0.3615948	total: 2m 56s	remaining: 30.8s
1277:	learn: 0.3615468	total: 2m 56s	remaining: 30.6s
1278:	learn: 0.3615078	total: 2m 56s	remaining: 30.5s
1279:	learn: 0.3614703	total: 2m 56s	remaining: 30.4s
1280:	learn: 0.3614443	total: 2m 56s	remaining: 30.2s
1281:	learn: 0.3613963	total: 2m 56s	remaining: 30.1s
1282:	learn: 0.3613575	total: 2m 57s	remaining: 30s
1283:	learn: 0.3613104	total: 2m 57s	remaining: 29.8s
1284:	learn: 0.3612746	total: 2m 57s	remaining: 29.7s
1285:	learn: 0.3612285	total: 2m 57s	remaining: 29.5s
1286:	learn: 0.3611899	total: 2m 57s	remaining: 29.4s
1287:	learn: 0.3611408	total: 2m 57s	remaining: 29.3s
1288:	learn: 0.3610895	total: 2m 57s	remaining: 29.1s
1289:	learn: 0.3610438	total: 2m 58s	remaining: 29s
1290:	learn: 0.3610179	total: 2m 58s	remaining: 28.8s
1291:	learn: 0.3609730	total: 2m 58s	remaining: 28.7s
1292:	learn: 0.3609301	total: 2m 58s	remaining: 28.6s
1293:	learn: 0.3609009	total: 2m 58s	remaining: 28.4s
1294:	learn: 0.3608725	total: 2m 58s	remaining: 28.3s
1295:	learn: 0.3608447	total: 2m 58s	remaining: 28.1s
1296:	learn: 0.3608000	total: 2m 58s	remaining: 28s
1297:	learn: 0.3607582	total: 2m 59s	remaining: 27.9s
1298:	learn: 0.3607309	total: 2m 59s	remaining: 27.7s
1299:	learn: 0.3606926	total: 2m 59s	remaining: 27.6s
1300:	learn: 0.3606509	total: 2m 59s	remaining: 27.4s
1301:	learn: 0.3606097	total: 2m 59s	remaining: 27.3s
1302:	learn: 0.3605526	total: 2m 59s	remaining: 27.2s
1303:	learn: 0.3605295	total: 2m 59s	remaining: 27s
1304:	learn: 0.3604954	total: 2m 59s	remaining: 26.9s
1305:	learn: 0.3604551	total: 3m	remaining: 26.7s
1306:	learn: 0.3604154	total: 3m	remaining: 26.6s
1307:	learn: 0.3603847	total: 3m	remaining: 26.5s
1308:	learn: 0.3603432	total: 3m	remaining: 26.3s
1309:	learn: 0.3602991	total: 3m	remaining: 26.2s
1310:	learn: 0.3602567	total: 3m	remaining: 26.1s
1311:	learn: 0.3602128	total: 3m	remaining: 25.9s
1312:	learn: 0.3601897	total: 3m	remaining: 25.8s
1313:	learn: 0.3601452	total: 3m 1s	remaining: 25.6s
1314:	learn: 0.3601028	total: 3m 1s	remaining: 25.5s
1315:	learn: 0.3600518	total: 3m 1s	remaining: 25.4s
1316:	learn: 0.3600140	total: 3m 1s	remaining: 25.2s
1317:	learn: 0.3599781	total: 3m 1s	remaining: 25.1s
1318:	learn: 0.3599371	total: 3m 1s	remaining: 24.9s
1319:	learn: 0.3598946	total: 3m 1s	remaining: 24.8s
1320:	learn: 0.3598574	total: 3m 1s	remaining: 24.7s
1321:	learn: 0.3598158	total: 3m 2s	remaining: 24.5s
1322:	learn: 0.3597757	total: 3m 2s	remaining: 24.4s
1323:	learn: 0.3597431	total: 3m 2s	remaining: 24.2s
1324:	learn: 0.3596943	total: 3m 2s	remaining: 24.1s
1325:	learn: 0.3596495	total: 3m 2s	remaining: 24s
1326:	learn: 0.3596090	total: 3m 2s	remaining: 23.8s
1327:	learn: 0.3595589	total: 3m 2s	remaining: 23.7s
1328:	learn: 0.3595286	total: 3m 2s	remaining: 23.5s
1329:	learn: 0.3594890	total: 3m 3s	remaining: 23.4s
1330:	learn: 0.3594571	total: 3m 3s	remaining: 23.3s
1331:	learn: 0.3594033	total: 3m 3s	remaining: 23.1s
1332:	learn: 0.3593578	total: 3m 3s	remaining: 23s
1333:	learn: 0.3593202	total: 3m 3s	remaining: 22.9s
1334:	learn: 0.3592814	total: 3m 3s	remaining: 22.7s
1335:	learn: 0.3592379	total: 3m 3s	remaining: 22.6s
1336:	learn: 0.3591935	total: 3m 4s	remaining: 22.4s
1337:	learn: 0.3591522	total: 3m 4s	remaining: 22.3s
1338:	learn: 0.3591090	total: 3m 4s	remaining: 22.2s
1339:	learn: 0.3590660	total: 3m 4s	remaining: 22s
1340:	learn: 0.3590225	total: 3m 4s	remaining: 21.9s
1341:	learn: 0.3589723	total: 3m 4s	remaining: 21.7s
1342:	learn: 0.3589290	total: 3m 4s	remaining: 21.6s
1343:	learn: 0.3588950	total: 3m 4s	remaining: 21.5s
1344:	learn: 0.3588567	total: 3m 5s	remaining: 21.3s
1345:	learn: 0.3588299	total: 3m 5s	remaining: 21.2s
1346:	learn: 0.3587880	total: 3m 5s	remaining: 21s
1347:	learn: 0.3587750	total: 3m 5s	remaining: 20.9s
1348:	learn: 0.3587394	total: 3m 5s	remaining: 20.8s
1349:	learn: 0.3586936	total: 3m 5s	remaining: 20.6s
1350:	learn: 0.3586533	total: 3m 5s	remaining: 20.5s
1351:	learn: 0.3586075	total: 3m 5s	remaining: 20.4s
1352:	learn: 0.3585597	total: 3m 6s	remaining: 20.2s
1353:	learn: 0.3585215	total: 3m 6s	remaining: 20.1s
1354:	learn: 0.3584739	total: 3m 6s	remaining: 19.9s
1355:	learn: 0.3584413	total: 3m 6s	remaining: 19.8s
1356:	learn: 0.3584026	total: 3m 6s	remaining: 19.7s
1357:	learn: 0.3583617	total: 3m 6s	remaining: 19.5s
1358:	learn: 0.3583323	total: 3m 6s	remaining: 19.4s
1359:	learn: 0.3582935	total: 3m 6s	remaining: 19.2s
1360:	learn: 0.3582573	total: 3m 7s	remaining: 19.1s
1361:	learn: 0.3581977	total: 3m 7s	remaining: 19s
1362:	learn: 0.3581530	total: 3m 7s	remaining: 18.8s
1363:	learn: 0.3581208	total: 3m 7s	remaining: 18.7s
1364:	learn: 0.3580750	total: 3m 7s	remaining: 18.6s
1365:	learn: 0.3580409	total: 3m 7s	remaining: 18.4s
1366:	learn: 0.3579973	total: 3m 7s	remaining: 18.3s
1367:	learn: 0.3579666	total: 3m 8s	remaining: 18.1s
1368:	learn: 0.3579252	total: 3m 8s	remaining: 18s
1369:	learn: 0.3578912	total: 3m 8s	remaining: 17.9s
1370:	learn: 0.3578475	total: 3m 8s	remaining: 17.7s
1371:	learn: 0.3577987	total: 3m 8s	remaining: 17.6s
1372:	learn: 0.3577501	total: 3m 8s	remaining: 17.5s
1373:	learn: 0.3577209	total: 3m 8s	remaining: 17.3s
1374:	learn: 0.3576843	total: 3m 8s	remaining: 17.2s
1375:	learn: 0.3576477	total: 3m 9s	remaining: 17s
1376:	learn: 0.3576071	total: 3m 9s	remaining: 16.9s
1377:	learn: 0.3575695	total: 3m 9s	remaining: 16.8s
1378:	learn: 0.3575260	total: 3m 9s	remaining: 16.6s
1379:	learn: 0.3574841	total: 3m 9s	remaining: 16.5s
1380:	learn: 0.3574415	total: 3m 9s	remaining: 16.3s
1381:	learn: 0.3573950	total: 3m 9s	remaining: 16.2s
1382:	learn: 0.3573692	total: 3m 9s	remaining: 16.1s
1383:	learn: 0.3573245	total: 3m 10s	remaining: 15.9s
1384:	learn: 0.3572899	total: 3m 10s	remaining: 15.8s
1385:	learn: 0.3572516	total: 3m 10s	remaining: 15.7s
1386:	learn: 0.3572055	total: 3m 10s	remaining: 15.5s
1387:	learn: 0.3571712	total: 3m 10s	remaining: 15.4s
1388:	learn: 0.3571327	total: 3m 10s	remaining: 15.2s
1389:	learn: 0.3571032	total: 3m 10s	remaining: 15.1s
1390:	learn: 0.3570525	total: 3m 10s	remaining: 15s
1391:	learn: 0.3570100	total: 3m 11s	remaining: 14.8s
1392:	learn: 0.3569721	total: 3m 11s	remaining: 14.7s
1393:	learn: 0.3569151	total: 3m 11s	remaining: 14.5s
1394:	learn: 0.3568830	total: 3m 11s	remaining: 14.4s
1395:	learn: 0.3568565	total: 3m 11s	remaining: 14.3s
1396:	learn: 0.3568099	total: 3m 11s	remaining: 14.1s
1397:	learn: 0.3567733	total: 3m 11s	remaining: 14s
1398:	learn: 0.3567390	total: 3m 11s	remaining: 13.9s
1399:	learn: 0.3567212	total: 3m 12s	remaining: 13.7s
1400:	learn: 0.3566849	total: 3m 12s	remaining: 13.6s
1401:	learn: 0.3566472	total: 3m 12s	remaining: 13.4s
1402:	learn: 0.3566104	total: 3m 12s	remaining: 13.3s
1403:	learn: 0.3565758	total: 3m 12s	remaining: 13.2s
1404:	learn: 0.3565240	total: 3m 12s	remaining: 13s
1405:	learn: 0.3564886	total: 3m 12s	remaining: 12.9s
1406:	learn: 0.3564539	total: 3m 13s	remaining: 12.8s
1407:	learn: 0.3564344	total: 3m 13s	remaining: 12.6s
1408:	learn: 0.3563900	total: 3m 13s	remaining: 12.5s
1409:	learn: 0.3563518	total: 3m 13s	remaining: 12.3s
1410:	learn: 0.3563167	total: 3m 13s	remaining: 12.2s
1411:	learn: 0.3562814	total: 3m 13s	remaining: 12.1s
1412:	learn: 0.3562565	total: 3m 13s	remaining: 11.9s
1413:	learn: 0.3562192	total: 3m 13s	remaining: 11.8s
1414:	learn: 0.3561777	total: 3m 13s	remaining: 11.7s
1415:	learn: 0.3561299	total: 3m 14s	remaining: 11.5s
1416:	learn: 0.3560959	total: 3m 14s	remaining: 11.4s
1417:	learn: 0.3560520	total: 3m 14s	remaining: 11.2s
1418:	learn: 0.3560077	total: 3m 14s	remaining: 11.1s
1419:	learn: 0.3559668	total: 3m 14s	remaining: 11s
1420:	learn: 0.3559231	total: 3m 14s	remaining: 10.8s
1421:	learn: 0.3558895	total: 3m 14s	remaining: 10.7s
1422:	learn: 0.3558417	total: 3m 15s	remaining: 10.6s
1423:	learn: 0.3558053	total: 3m 15s	remaining: 10.4s
1424:	learn: 0.3557704	total: 3m 15s	remaining: 10.3s
1425:	learn: 0.3557424	total: 3m 15s	remaining: 10.1s
1426:	learn: 0.3557044	total: 3m 15s	remaining: 10s
1427:	learn: 0.3556621	total: 3m 15s	remaining: 9.87s
1428:	learn: 0.3556341	total: 3m 15s	remaining: 9.73s
1429:	learn: 0.3555891	total: 3m 15s	remaining: 9.59s
1430:	learn: 0.3555440	total: 3m 16s	remaining: 9.45s
1431:	learn: 0.3554991	total: 3m 16s	remaining: 9.32s
1432:	learn: 0.3554555	total: 3m 16s	remaining: 9.18s
1433:	learn: 0.3554129	total: 3m 16s	remaining: 9.04s
1434:	learn: 0.3553949	total: 3m 16s	remaining: 8.9s
1435:	learn: 0.3553631	total: 3m 16s	remaining: 8.77s
1436:	learn: 0.3553180	total: 3m 16s	remaining: 8.63s
1437:	learn: 0.3552792	total: 3m 17s	remaining: 8.49s
1438:	learn: 0.3552439	total: 3m 17s	remaining: 8.36s
1439:	learn: 0.3552021	total: 3m 17s	remaining: 8.22s
1440:	learn: 0.3551645	total: 3m 17s	remaining: 8.08s
1441:	learn: 0.3551349	total: 3m 17s	remaining: 7.94s
1442:	learn: 0.3551026	total: 3m 17s	remaining: 7.81s
1443:	learn: 0.3550558	total: 3m 17s	remaining: 7.67s
1444:	learn: 0.3550184	total: 3m 17s	remaining: 7.53s
1445:	learn: 0.3549870	total: 3m 18s	remaining: 7.39s
1446:	learn: 0.3549426	total: 3m 18s	remaining: 7.26s
1447:	learn: 0.3549223	total: 3m 18s	remaining: 7.12s
1448:	learn: 0.3548782	total: 3m 18s	remaining: 6.98s
1449:	learn: 0.3548481	total: 3m 18s	remaining: 6.84s
1450:	learn: 0.3548072	total: 3m 18s	remaining: 6.71s
1451:	learn: 0.3547660	total: 3m 18s	remaining: 6.57s
1452:	learn: 0.3547222	total: 3m 18s	remaining: 6.43s
1453:	learn: 0.3547216	total: 3m 18s	remaining: 6.29s
1454:	learn: 0.3546753	total: 3m 19s	remaining: 6.16s
1455:	learn: 0.3546394	total: 3m 19s	remaining: 6.02s
1456:	learn: 0.3546027	total: 3m 19s	remaining: 5.88s
1457:	learn: 0.3545656	total: 3m 19s	remaining: 5.75s
1458:	learn: 0.3545153	total: 3m 19s	remaining: 5.61s
1459:	learn: 0.3544764	total: 3m 19s	remaining: 5.47s
1460:	learn: 0.3544495	total: 3m 19s	remaining: 5.33s
1461:	learn: 0.3544075	total: 3m 19s	remaining: 5.2s
1462:	learn: 0.3543666	total: 3m 20s	remaining: 5.06s
1463:	learn: 0.3543456	total: 3m 20s	remaining: 4.92s
1464:	learn: 0.3543029	total: 3m 20s	remaining: 4.78s
1465:	learn: 0.3542576	total: 3m 20s	remaining: 4.65s
1466:	learn: 0.3542323	total: 3m 20s	remaining: 4.51s
1467:	learn: 0.3541965	total: 3m 20s	remaining: 4.37s
1468:	learn: 0.3541615	total: 3m 20s	remaining: 4.24s
1469:	learn: 0.3541308	total: 3m 20s	remaining: 4.1s
1470:	learn: 0.3540908	total: 3m 20s	remaining: 3.96s
1471:	learn: 0.3540551	total: 3m 21s	remaining: 3.83s
1472:	learn: 0.3540110	total: 3m 21s	remaining: 3.69s
1473:	learn: 0.3539682	total: 3m 21s	remaining: 3.55s
1474:	learn: 0.3539258	total: 3m 21s	remaining: 3.41s
1475:	learn: 0.3538742	total: 3m 21s	remaining: 3.28s
1476:	learn: 0.3538359	total: 3m 21s	remaining: 3.14s
1477:	learn: 0.3538117	total: 3m 21s	remaining: 3s
1478:	learn: 0.3537667	total: 3m 21s	remaining: 2.87s
1479:	learn: 0.3537261	total: 3m 22s	remaining: 2.73s
1480:	learn: 0.3536988	total: 3m 22s	remaining: 2.59s
1481:	learn: 0.3536566	total: 3m 22s	remaining: 2.46s
1482:	learn: 0.3536259	total: 3m 22s	remaining: 2.32s
1483:	learn: 0.3535815	total: 3m 22s	remaining: 2.18s
1484:	learn: 0.3535458	total: 3m 22s	remaining: 2.05s
1485:	learn: 0.3535259	total: 3m 22s	remaining: 1.91s
1486:	learn: 0.3534827	total: 3m 23s	remaining: 1.77s
1487:	learn: 0.3534467	total: 3m 23s	remaining: 1.64s
1488:	learn: 0.3534038	total: 3m 23s	remaining: 1.5s
1489:	learn: 0.3533546	total: 3m 23s	remaining: 1.36s
1490:	learn: 0.3533052	total: 3m 23s	remaining: 1.23s
1491:	learn: 0.3532728	total: 3m 23s	remaining: 1.09s
1492:	learn: 0.3532336	total: 3m 23s	remaining: 956ms
1493:	learn: 0.3531886	total: 3m 23s	remaining: 819ms
1494:	learn: 0.3531436	total: 3m 24s	remaining: 682ms
1495:	learn: 0.3531004	total: 3m 24s	remaining: 546ms
1496:	learn: 0.3530737	total: 3m 24s	remaining: 409ms
1497:	learn: 0.3530229	total: 3m 24s	remaining: 273ms
1498:	learn: 0.3529743	total: 3m 24s	remaining: 136ms
1499:	learn: 0.3529358	total: 3m 24s	remaining: 0us
Wall time: 6min 28s
Out[15]:
VotingClassifier(estimators=[('xgbc',
                              XGBClassifier(base_score=None, booster='gbtree',
                                            colsample_bylevel=None,
                                            colsample_bynode=1,
                                            colsample_bytree=1,
                                            enable_categorical=False, gamma=0,
                                            gpu_id=None, importance_type=None,
                                            interaction_constraints=None,
                                            learning_rate=0.3,
                                            max_delta_step=None, max_depth=6,
                                            min_child_weight=None, missing=nan,
                                            monotone_constraints=None,
                                            n_estimators=103, n_jobs=None,
                                            num_parallel_tree=None,
                                            predictor=None, random_state=69,
                                            reg_alpha=None, reg_lambda=None,
                                            scale_pos_weight=None, subsample=1,
                                            tree_method=None,
                                            validate_parameters=None,
                                            verbosity=None)),
                             ('lgbc',
                              LGBMClassifier(n_estimators=2000,
                                             objective='binary',
                                             random_state=69)),
                             ('catgbc',
                              <catboost.core.CatBoostClassifier object at 0x000001B87ED74E20>)],
                 voting='soft')

Compute FPR + FNR score

In [16]:
voting_clf_y_pred = voting_clf.predict(voting_gbc_X_valid)
valid_score = criterion(voting_clf_y_pred, voting_gbc_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.38417217182315866

To further inspect the performance

In [17]:
# to further inspect the performance:
CM = confusion_matrix(voting_gbc_y_valid, voting_clf_y_pred)
TN, TP = CM[0, 0], CM[1, 1]
FP, FN = CM[0, 1], CM[1, 0]
print('Confusion Matrix: \n {}'.format(CM))
print('Accuracy: {}'.format((TP + TN) / (TP + TN + FP + FN)))  
print('False Positive Rate: {}'.format(FP / (FP + TN)))  
print('False Negative Rate: {}'.format(FN / (FN + TP)))
print('FPR + FNR = {}'.format(FP / (FP + TN) + FN / (FN + TP)))
plt.figure(figsize=(6,4))
plt.grid()
gb_y_prob = voting_clf.predict_proba(voting_gbc_X_valid)[:, 1]
fpr, tpr, thresholds = roc_curve(voting_gbc_y_valid, gb_y_prob, pos_label=1)
idx = np.argmin(fpr + (1-tpr))
plt.plot(fpr, 1-tpr, label='RF')
plt.plot(fpr[idx], (1-tpr)[idx], '+', color='k')
plt.legend(loc='best')
plt.xlabel('FPR')
plt.ylabel('FNR')
plt.show()
Confusion Matrix: 
 [[24456  5234]
 [ 6186 23571]]
Accuracy: 0.8078961091392333
False Positive Rate: 0.17628831256315258
False Negative Rate: 0.20788385926000605
FPR + FNR = 0.38417217182315866

Result discussion

The $FPR$ and $FNR$ rate is the lower for this notebook: 0.384. The voting system provides that encompass Three boosting models give better performance than XGBoost, CatBoost and LightGBM on the binary classification task. But a voting system with a majority vote generates better performance than a soft vote which predict the class label based on the argmax of the sums of the predicted probabilities. So the best model remains the voting system with a majority vote.

12/ Neural Network

Without no doubt, nowadays neural networks outperform most traditional approaches. For computer vision tasks they are one of the best options. However, neural network approaches require an adequate architecture and suffer from a training time that can be very long. Moreover, to train a neural network efficiently, it requires a large volume of data.

To be efficient, neural networks need an adequate architecture based on the stacking of different layers.

In order to find the best architecture, I have made several attempts and the architecture I have chosen is a neural network with 3 dense layers composed of 48 neurons (layer_size = 48) per layer associated with a ReLu type activation function. I also use batch normalisation for improved performance. To prevent the risk of overfitting I set a dropout at 0.5 at each layer. Indeed, overfitting is a frequent problem when training a Deep Learning model, but a technique exists to counter it: the Dropout. In fact, the term "Dropout" refers to the deletion of neurons in the layers of a Deep Learning model. In fact, we temporarily deactivate some neurons in the network, as well as all its input and output connections. More concretely, At each epoch, we apply this random deactivation. That is to say that at each pass (forward propagation) the model will learn with a different configuration of neurons, the neurons being randomly activated and deactivated.

Regarding the training strategy of the neural network, I chose to train the neural network on the entire training dataset without taking into account the selected variables because a neural network requires a large volume of data to be effective. Thus the parameters chosen for the training are the following:

  • epoch = 20
  • batch_size = 1024

We note that:

  • Batch_size is the size of a training batch processed before the model parameters are updated. Choosing a batch size that is too small will introduce a high degree of variance (noise) into each batch because a small sample is unlikely to be a good representation of the data set. Conversely, if a batch size is too large, it may not fit in the memory of the computational instance used for training and it will tend to overfit the data.
  • Epoch is the number of complete learning cycles. At the end of each complete learning cycle, the algorithm is able to adapt its parameters to make better predictions for the next learning cycle. Too many epochs can cause the learning model to over-train (over-fit). There is no magic rule to choose the number of epochs

Split data

In [157]:
# split into X_train, y_train, X_valid and y_valid
nn_X_train, nn_X_valid, nn_y_train, nn_y_valid = train_test_split(X_dataframe, y_dataframe, test_size=0.2, random_state=12)

Convert data to numpy

In [158]:
# convert data to numpy
nn_X_train = np.array(nn_X_train)
nn_y_train = np.ravel(np.array(nn_y_train))
nn_X_valid = np.array(nn_X_valid)
nn_y_valid = np.array(nn_y_valid)

Define the neural network hyperparameters

In [159]:
# architecture of the network
nn_epochs = 20
nn_batch_size = 1024
nn_verbose = 1
nn_input_size = nn_X_train.shape[1]
nn_layer_size = 48 # neurons number
nn_validation_split = 0.1
nn_dropout = 0.5 # manage overfitting

Define the neural network architecture

In [160]:
# building of the model
model = tf.keras.models.Sequential()

# dense layers
model.add(keras.layers.Dense(nn_layer_size,input_shape=(nn_input_size,), name='dense_layer_1', use_bias=False))
model.add(keras.layers.BatchNormalization())
model.add(keras.layers.Activation("relu"))
model.add(keras.layers.Dropout(nn_dropout))

model.add(keras.layers.Dense(nn_layer_size, name='dense_layer_2', use_bias=False))
model.add(keras.layers.BatchNormalization())
model.add(keras.layers.Activation("relu"))
model.add(keras.layers.Dropout(nn_dropout))

model.add(keras.layers.Dense(nn_layer_size, name='dense_layer_3', use_bias=False))
model.add(keras.layers.BatchNormalization())
model.add(keras.layers.Activation("relu"))
model.add(keras.layers.Dropout(nn_dropout))

# decision layer
model.add(keras.layers.Dense(1, name='dense_layer_final', activation='sigmoid'))

Summary of the neural network architecture

In [161]:
# summary of the model
model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_layer_1 (Dense)       (None, 48)                4608      
                                                                 
 batch_normalization (BatchN  (None, 48)               192       
 ormalization)                                                   
                                                                 
 activation (Activation)     (None, 48)                0         
                                                                 
 dropout (Dropout)           (None, 48)                0         
                                                                 
 dense_layer_2 (Dense)       (None, 48)                2304      
                                                                 
 batch_normalization_1 (Batc  (None, 48)               192       
 hNormalization)                                                 
                                                                 
 activation_1 (Activation)   (None, 48)                0         
                                                                 
 dropout_1 (Dropout)         (None, 48)                0         
                                                                 
 dense_layer_3 (Dense)       (None, 48)                2304      
                                                                 
 batch_normalization_2 (Batc  (None, 48)               192       
 hNormalization)                                                 
                                                                 
 activation_2 (Activation)   (None, 48)                0         
                                                                 
 dropout_2 (Dropout)         (None, 48)                0         
                                                                 
 dense_layer_final (Dense)   (None, 1)                 49        
                                                                 
=================================================================
Total params: 9,841
Trainable params: 9,553
Non-trainable params: 288
_________________________________________________________________

Compile the neural network

In [162]:
# compilation
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=["accuracy"])

Train the neural network

In [163]:
%%time
# training of the model
history = model.fit(nn_X_train, nn_y_train, batch_size=nn_batch_size, epochs=nn_epochs, verbose=nn_verbose, \
         validation_split=nn_validation_split);
Epoch 1/20
209/209 [==============================] - 4s 12ms/step - loss: 0.6650 - accuracy: 0.6329 - val_loss: 1.0905 - val_accuracy: 0.5619
Epoch 2/20
209/209 [==============================] - 2s 10ms/step - loss: 0.5636 - accuracy: 0.7221 - val_loss: 0.6119 - val_accuracy: 0.6950
Epoch 3/20
209/209 [==============================] - 2s 11ms/step - loss: 0.5439 - accuracy: 0.7371 - val_loss: 0.5287 - val_accuracy: 0.7438
Epoch 4/20
209/209 [==============================] - 2s 11ms/step - loss: 0.5337 - accuracy: 0.7445 - val_loss: 0.5113 - val_accuracy: 0.7489
Epoch 5/20
209/209 [==============================] - 2s 11ms/step - loss: 0.5245 - accuracy: 0.7503 - val_loss: 0.5045 - val_accuracy: 0.7551
Epoch 6/20
209/209 [==============================] - 2s 10ms/step - loss: 0.5190 - accuracy: 0.7534 - val_loss: 0.5008 - val_accuracy: 0.7562
Epoch 7/20
209/209 [==============================] - 2s 10ms/step - loss: 0.5137 - accuracy: 0.7577 - val_loss: 0.5019 - val_accuracy: 0.7567
Epoch 8/20
209/209 [==============================] - 2s 11ms/step - loss: 0.5091 - accuracy: 0.7594 - val_loss: 0.5042 - val_accuracy: 0.7562
Epoch 9/20
209/209 [==============================] - 2s 10ms/step - loss: 0.5049 - accuracy: 0.7619 - val_loss: 0.4968 - val_accuracy: 0.7669
Epoch 10/20
209/209 [==============================] - 2s 10ms/step - loss: 0.5008 - accuracy: 0.7636 - val_loss: 0.4974 - val_accuracy: 0.7638
Epoch 11/20
209/209 [==============================] - 2s 10ms/step - loss: 0.4966 - accuracy: 0.7672 - val_loss: 0.4943 - val_accuracy: 0.7624
Epoch 12/20
209/209 [==============================] - 2s 10ms/step - loss: 0.4948 - accuracy: 0.7675 - val_loss: 0.4936 - val_accuracy: 0.7596
Epoch 13/20
209/209 [==============================] - 2s 10ms/step - loss: 0.4917 - accuracy: 0.7700 - val_loss: 0.4851 - val_accuracy: 0.7755
Epoch 14/20
209/209 [==============================] - 2s 11ms/step - loss: 0.4906 - accuracy: 0.7687 - val_loss: 0.4821 - val_accuracy: 0.7698
Epoch 15/20
209/209 [==============================] - 2s 11ms/step - loss: 0.4875 - accuracy: 0.7716 - val_loss: 0.4954 - val_accuracy: 0.7591
Epoch 16/20
209/209 [==============================] - 2s 11ms/step - loss: 0.4863 - accuracy: 0.7726 - val_loss: 0.4791 - val_accuracy: 0.7785
Epoch 17/20
209/209 [==============================] - 2s 10ms/step - loss: 0.4843 - accuracy: 0.7746 - val_loss: 0.4887 - val_accuracy: 0.7711
Epoch 18/20
209/209 [==============================] - 2s 10ms/step - loss: 0.4831 - accuracy: 0.7751 - val_loss: 0.4790 - val_accuracy: 0.7763
Epoch 19/20
209/209 [==============================] - 2s 11ms/step - loss: 0.4819 - accuracy: 0.7752 - val_loss: 0.4732 - val_accuracy: 0.7735
Epoch 20/20
209/209 [==============================] - 2s 10ms/step - loss: 0.4805 - accuracy: 0.7760 - val_loss: 0.4748 - val_accuracy: 0.7769
Wall time: 46.1 s

Display the evolution of the accuracy and the loss function at each epoch

In [164]:
# list all data in history
print(history.history.keys())
# summarize history for accuracy
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
# summarize history for loss
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])

Compute FPR + FNR score

In [165]:
# compute FPR + FNR score

# set a threshold to convert sigmoid ouput probablity to binary value {True, False}
nn_y_pred = tf.greater(model.predict(nn_X_valid), 0.5)
# convert tensor of {True, False} binary value to int {1, 0} value
nn_y_pred = nn_y_pred.numpy().astype(int)
valid_score = criterion(nn_y_pred, nn_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.44335784801049566

Result discussion

The $FPR$ and $FNR$ rate is not the slower of this notebook: 0.44. Indeed, boosting algorithm suach as XGBoost, LightGBM and CatBoost great better with $FPR$ and $FNR$ rate at 0.40 than the neural network. This poor performance can be explained by the fact that the selected architecture may not be the best one available.

Let's try co generate new features

The objective is to extend the number of characteristics of the training dataset and apply my best model which is the combination of XGBoost, LightGBM and CatBoost models in a voting classifier with the same hyperparameters

To do this, it is necessary to create linear combinations between the vectors Z1 and Z2 such as Z1+Z2, Z1-Z2

Generation of new features for the X_dataframe training dataset

In [8]:
X_dataframe_enlarged = X_dataframe.copy()

for i in range(X_dataframe_enlarged.shape[1]):
    col_A = X_dataframe_enlarged.iloc[:,i]
    col_B = X_dataframe_enlarged.iloc[:,48+i]
    X_dataframe_enlarged["col_"+str(i)+"minus_col_"+str(48+i)] = col_A - col_B
    X_dataframe_enlarged["col_"+str(i)+"plus_col_"+str(48+i)] = col_A + col_B
    X_dataframe_enlarged["col_"+str(i)+"dot_col_"+str(48+i)] = col_A * col_B
    
X_dataframe_enlarged
Out[8]:
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 ... col_79plus_col_127 col_79dot_col_127 col_80minus_col_128 col_80plus_col_128 col_80dot_col_128 col_81minus_col_129 col_81plus_col_129 col_81dot_col_129 col_82minus_col_130 col_82plus_col_130 col_82dot_col_130 col_83minus_col_131 col_83plus_col_131 col_83dot_col_131 col_84minus_col_132 col_84plus_col_132 col_84dot_col_132 col_85minus_col_133 col_85plus_col_133 col_85dot_col_133 col_86minus_col_134 col_86plus_col_134 col_86dot_col_134 col_87minus_col_135 col_87plus_col_135 col_87dot_col_135 col_88minus_col_136 col_88plus_col_136 col_88dot_col_136 col_89minus_col_137 col_89plus_col_137 col_89dot_col_137 col_90minus_col_138 col_90plus_col_138 col_90dot_col_138 col_91minus_col_139 col_91plus_col_139 col_91dot_col_139 col_92minus_col_140 col_92plus_col_140 col_92dot_col_140 col_93minus_col_141 col_93plus_col_141 col_93dot_col_141 col_94minus_col_142 col_94plus_col_142 col_94dot_col_142 col_95minus_col_143 col_95plus_col_143 col_95dot_col_143
0 236.031 1.000 0.000 0.000 -2.134 -3.195 1.100 23.831 1.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 193.0 255.0 160.0 0.0 217.0 79.0 175.0 194.0 221.0 254.0 255.0 254.0 245.0 7.806 4.148 -0.411 2.673 1.451 84.0 0.945000 0.986000 0.621000 0.993000 0.686000 0.375000 0.998000 0.801000 0.0 3.021000 0.0 0.0 238.253 1.000 ... 3.926 3.852 25.732 27.732 26.732 3.698 3.698 0.000 -3.463 0.537 -2.926 77.0 79.0 78.0 0.929000 0.929000 0.0 0.988000 0.988000 0.0 0.643000 0.643000 0.0 0.912000 0.912000 0.000 0.995000 0.995000 0.000 0.335000 0.335000 0.000 0.995000 0.995000 0.000000 0.829000 0.829000 0.000000 0.000000 0.000000 0.0 2.503000 2.503000 0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0
1 228.040 1.000 0.000 0.000 -0.860 -15.950 -1.410 58.040 0.0 1.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 212.0 255.0 255.0 213.0 96.0 172.0 188.0 226.0 254.0 196.0 225.0 210.0 2.570 1.130 -2.610 -4.070 -2.220 121.0 0.823407 0.993560 0.918158 0.858533 0.932080 0.649915 0.902721 0.869304 0.0 2.331588 1.0 0.0 107.160 1.000 ... 3.130 2.260 4.470 6.470 5.470 23.950 23.950 0.000 2.720 6.720 9.440 95.0 97.0 96.0 0.985727 0.985727 0.0 0.996261 0.996261 0.0 0.637621 0.637621 0.0 0.847861 0.847861 0.000 0.927677 0.927677 0.000 0.403857 0.403857 0.000 1.648128 0.308128 -0.655346 0.156418 1.496418 0.553700 0.000000 0.000000 0.0 2.245238 2.245238 0.0 -509.0 511.0 510.0 -65025.0 65025.0 0.0
2 158.310 1.000 0.000 0.000 -2.290 -7.680 0.120 22.770 1.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 201.0 255.0 248.0 0.0 255.0 220.0 236.0 175.0 194.0 202.0 254.0 255.0 254.0 247.0 5.660 1.540 -1.620 -0.060 -0.750 63.0 0.616242 0.985232 0.961017 0.995382 0.696401 0.768143 0.993044 0.860056 0.0 2.580157 0.0 0.0 214.620 1.000 ... 3.850 3.700 1.830 3.830 2.830 -4.280 -4.280 -0.000 0.890 4.890 5.780 59.0 61.0 60.0 0.718895 0.718895 0.0 0.999113 0.999113 0.0 0.611766 0.611766 0.0 0.617831 0.617831 0.000 0.864550 0.864550 0.000 0.388376 0.388376 0.000 1.642516 0.202516 -0.664212 0.011915 1.451915 0.526979 0.000000 0.000000 0.0 3.006712 3.006712 0.0 -509.0 511.0 510.0 -65025.0 65025.0 0.0
3 165.464 0.284 0.000 0.716 38.303 -16.267 -9.469 45.229 1.0 0.003 1.0 1.0 0.0 0.0 0.001 255.0 150.0 93.0 169.0 32.0 0.0 110.0 0.0 175.0 171.0 130.0 11.0 221.0 225.0 137.0 -10.177 2.300 57.818 1.508 1.880 56.0 0.897000 0.992000 0.500000 0.826000 0.993000 0.853000 0.990000 0.865000 0.0 -0.883000 0.0 0.0 173.099 1.000 ... 6.290 8.580 -36.607 -34.607 -35.607 3.839 5.839 4.839 -6.786 -4.786 -5.786 68.0 68.0 0.0 0.884000 0.884000 0.0 0.993000 0.993000 0.0 0.836000 0.836000 0.0 1.834000 -0.166000 -0.834 -0.061000 1.939000 0.939 0.892000 0.892000 0.000 0.991000 0.991000 0.000000 0.909000 0.913000 0.001822 -0.000001 0.000001 0.0 -2.930000 -2.930000 -0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0
4 153.727 1.000 0.000 0.000 0.780 -5.169 -0.291 27.574 0.0 0.001 1.0 1.0 0.0 0.0 0.001 255.0 255.0 218.0 247.0 255.0 0.0 154.0 72.0 175.0 140.0 199.0 171.0 255.0 254.0 145.0 3.239 2.047 0.166 1.760 1.430 88.0 0.545000 0.628000 0.876000 0.999000 0.639000 0.440000 1.002000 0.733000 0.0 0.575000 0.0 0.0 201.940 0.862 ... 3.744 3.488 -4.066 -2.066 -3.066 -2.805 -2.805 -0.000 -3.338 0.662 -2.676 87.0 89.0 88.0 0.914000 0.914000 0.0 0.936000 0.936000 0.0 0.968000 0.968000 0.0 0.963000 0.963000 0.000 0.657000 0.657000 0.000 0.483000 0.483000 0.000 0.986000 0.988000 0.000987 0.844000 0.846000 0.000845 0.000000 0.000000 0.0 -1.016000 -1.016000 -0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
297227 77.865 0.989 0.006 0.004 27.117 -25.148 -19.818 26.619 0.0 0.002 1.0 1.0 0.0 0.0 0.001 255.0 255.0 195.0 255.0 171.0 0.0 118.0 0.0 175.0 194.0 220.0 162.0 255.0 254.0 0.0 -1.256 2.123 26.403 -30.177 -11.785 80.0 0.892000 0.974000 -0.038000 0.954000 0.934000 0.333000 1.005000 0.722000 0.0 0.216000 0.0 0.0 97.226 0.286 ... 3.817 3.634 -35.317 -33.317 -34.317 -3.931 -3.931 -0.000 0.428 4.428 4.856 66.0 68.0 67.0 0.757000 0.757000 0.0 0.822000 0.822000 0.0 0.927000 0.927000 0.0 0.993000 0.993000 0.000 0.060000 0.060000 0.000 0.292000 0.292000 0.000 0.980000 0.980000 0.000000 0.689000 0.693000 0.001382 -0.000001 0.000001 0.0 -1.718000 -1.718000 -0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0
297228 147.250 1.000 0.000 0.000 3.040 -9.620 -2.790 36.990 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 206.0 255.0 255.0 189.0 168.0 175.0 194.0 227.0 254.0 255.0 254.0 242.0 -0.810 1.100 5.540 -5.670 -2.530 117.0 0.932584 0.989263 0.866788 0.730796 1.006490 0.470933 0.993582 0.856815 0.0 2.642143 0.0 0.0 153.420 1.000 ... 3.140 2.280 10.420 12.420 11.420 -4.910 -4.910 -0.000 -5.900 -1.900 -7.800 97.0 99.0 98.0 0.900837 0.900837 0.0 0.646228 0.646228 0.0 0.796856 0.796856 0.0 0.987863 0.987863 0.000 0.457861 0.457861 0.000 0.305589 0.305589 0.000 0.986106 0.986106 0.000000 0.726660 0.726660 0.000000 0.000000 0.000000 0.0 2.328330 2.328330 0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0
297229 219.495 1.000 0.000 0.000 -31.372 -2.752 -2.153 33.358 1.0 0.001 1.0 0.0 0.0 1.0 0.001 255.0 255.0 255.0 255.0 255.0 0.0 110.0 232.0 175.0 177.0 130.0 165.0 255.0 254.0 153.0 -8.352 1.735 -34.138 2.945 -2.204 74.0 0.852000 0.586000 0.693000 1.003000 0.248000 0.287000 0.997000 0.667000 0.0 1.944000 0.0 0.0 193.696 0.989 ... 3.335 2.670 5.166 7.166 6.166 0.892 0.892 0.000 0.854 0.854 0.000 89.0 89.0 0.0 0.973000 0.973000 0.0 0.965000 0.965000 0.0 0.407000 0.407000 0.0 0.981000 0.981000 0.000 -1.002000 2.998000 1.996 -0.152000 1.848000 0.848 0.990000 0.990000 0.000000 0.878000 0.882000 0.001760 -0.000001 0.000001 0.0 1.779000 1.779000 0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0
297230 111.054 1.000 0.000 0.000 -13.329 -5.626 3.282 34.638 0.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 179.0 253.0 103.0 0.0 112.0 0.0 175.0 194.0 152.0 253.0 255.0 254.0 210.0 -7.390 1.947 -14.144 -1.181 0.357 87.0 0.910000 0.987000 0.702000 0.953000 0.850000 0.324000 0.986000 0.817000 0.0 2.496000 0.0 0.0 63.577 0.995 ... 6.413 8.826 39.631 41.631 40.631 16.873 16.873 0.000 7.637 11.637 19.274 71.0 73.0 72.0 0.817000 0.817000 0.0 0.978000 0.978000 0.0 0.742000 0.742000 0.0 0.936000 0.936000 0.000 0.465000 0.465000 0.000 0.392000 0.392000 0.000 0.986000 0.984000 -0.000985 0.759000 0.761000 0.000760 0.000000 0.000000 0.0 -2.154000 -2.154000 -0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0
297231 221.947 1.000 0.000 0.000 -34.777 -15.352 8.289 27.163 1.0 0.001 1.0 0.0 0.0 1.0 0.001 255.0 255.0 173.0 145.0 255.0 0.0 124.0 0.0 175.0 186.0 133.0 217.0 255.0 254.0 0.0 -6.255 2.125 -35.505 -2.704 2.327 78.0 0.613000 0.991000 0.863000 0.948000 0.892000 0.412000 0.996000 0.817000 0.0 -1.524000 0.0 0.0 250.406 1.000 ... 4.946 5.892 -7.912 -5.912 -6.912 -4.915 -4.915 -0.000 2.126 2.126 0.000 90.0 90.0 0.0 0.820000 0.820000 0.0 0.985000 0.985000 0.0 0.699000 0.699000 0.0 0.923000 0.923000 0.000 -1.016000 2.984000 1.968 -0.396000 1.604000 0.604 0.992000 0.992000 0.000000 0.856000 0.860000 0.001716 -0.000001 0.000001 0.0 0.250000 0.250000 0.0 -510.0 510.0 0.0 -65025.0 65025.0 0.0

297232 rows × 384 columns

Generation of new features for the X_test_dataframe test dataset

In [28]:
# Load test data
X_test = np.load("test_data.npy")
X_test_enlarged = X_test.copy()
X_test_dataframe_enlarged =  pd.DataFrame(X_test_enlarged)

for i in range(X_test_dataframe_enlarged.shape[1]):
    col_A = X_test_dataframe_enlarged.iloc[:,i]
    col_B = X_test_dataframe_enlarged.iloc[:,48+i]
    X_test_dataframe_enlarged["col_"+str(i)+"minus_col_"+str(48+i)] = col_A - col_B
    X_test_dataframe_enlarged["col_"+str(i)+"plus_col_"+str(48+i)] = col_A + col_B
    X_test_dataframe_enlarged["col_"+str(i)+"dot_col_"+str(48+i)] = col_A * col_B
    
X_test_dataframe_enlarged
Out[28]:
0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 ... col_79plus_col_127 col_79dot_col_127 col_80minus_col_128 col_80plus_col_128 col_80dot_col_128 col_81minus_col_129 col_81plus_col_129 col_81dot_col_129 col_82minus_col_130 col_82plus_col_130 col_82dot_col_130 col_83minus_col_131 col_83plus_col_131 col_83dot_col_131 col_84minus_col_132 col_84plus_col_132 col_84dot_col_132 col_85minus_col_133 col_85plus_col_133 col_85dot_col_133 col_86minus_col_134 col_86plus_col_134 col_86dot_col_134 col_87minus_col_135 col_87plus_col_135 col_87dot_col_135 col_88minus_col_136 col_88plus_col_136 col_88dot_col_136 col_89minus_col_137 col_89plus_col_137 col_89dot_col_137 col_90minus_col_138 col_90plus_col_138 col_90dot_col_138 col_91minus_col_139 col_91plus_col_139 col_91dot_col_139 col_92minus_col_140 col_92plus_col_140 col_92dot_col_140 col_93minus_col_141 col_93plus_col_141 col_93dot_col_141 col_94minus_col_142 col_94plus_col_142 col_94dot_col_142 col_95minus_col_143 col_95plus_col_143 col_95dot_col_143
0 170.140 1.000 0.000 0.000 -0.000 -11.020 -3.070 32.180 1.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 238.0 255.0 255.0 220.0 108.0 175.0 193.0 229.0 251.0 255.0 254.0 226.0 2.060 1.070 -1.630 -1.530 -0.780 118.0 0.844216 0.997256 0.929671 0.936884 0.752725 0.928838 0.992638 0.912097 0.0 2.857030 0.0 0.0 161.020 1.000 ... 3.090 2.180 4.650 6.650 5.650 -3.500 -3.500 -0.0 -2.230 1.770 -0.460 103.0 105.0 104.0 0.921770 0.921770 0.0 0.505394 0.505394 0.0 0.896021 0.896021 0.0 0.875605 0.875605 0.0 0.608151 0.608151 0.0 0.849606 0.849606 0.0 0.999814 0.959814 -0.019596 0.786070 0.826070 0.016121 0.000000 0.000000 0.0 -0.694720 -0.694720 -0.000000 -510.0 510.0 0.0 -65025.0 65025.0 0.0
1 135.148 0.994 0.006 0.000 -30.023 -15.470 9.360 44.548 0.0 0.001 1.0 1.0 0.0 0.0 0.001 255.0 255.0 162.0 204.0 221.0 0.0 110.0 56.0 175.0 119.0 130.0 254.0 255.0 254.0 140.0 -3.518 2.012 -34.138 -2.703 1.829 79.0 0.793000 0.848000 0.918000 0.971000 0.794000 0.429000 0.988000 0.821000 0.0 2.214000 0.0 0.0 80.220 0.772 ... 3.941 3.882 -31.289 -29.289 -30.289 11.183 11.183 0.0 -7.499 -3.499 -10.998 80.0 82.0 81.0 0.872000 0.872000 0.0 0.633000 0.633000 0.0 0.937000 0.937000 0.0 0.988000 0.988000 0.0 0.544000 0.544000 0.0 0.164000 0.164000 0.0 0.987000 0.987000 0.000000 0.731000 0.735000 0.001466 -0.000001 0.000001 0.0 -0.172000 -0.172000 -0.000000 -510.0 510.0 0.0 -65025.0 65025.0 0.0
2 194.070 1.000 0.000 0.000 -0.600 -16.610 -1.080 56.870 1.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 230.0 255.0 255.0 220.0 113.0 175.0 194.0 230.0 254.0 255.0 254.0 231.0 1.310 1.040 0.770 0.020 -1.030 127.0 0.827327 0.987265 0.676200 0.852015 0.990901 0.903850 0.947934 0.884600 0.0 3.019360 0.0 0.0 144.760 1.000 ... 3.400 2.800 -2.130 -0.130 -1.130 -0.140 -0.140 -0.0 -0.730 3.270 2.540 62.0 64.0 63.0 0.817503 0.817503 0.0 0.994516 0.994516 0.0 0.661587 0.661587 0.0 0.812022 0.812022 0.0 0.990604 0.990604 0.0 0.803162 0.803162 0.0 0.992972 0.992972 0.000000 0.867663 0.867663 0.000000 0.000000 0.000000 0.0 -107.866124 112.133876 234.726360 -400.0 400.0 0.0 -36975.0 36975.0 0.0
3 168.489 0.729 0.000 0.271 33.690 -22.828 -14.179 52.387 0.0 0.006 1.0 1.0 0.0 0.0 0.003 255.0 255.0 178.0 250.0 255.0 0.0 115.0 0.0 175.0 21.0 132.0 250.0 255.0 254.0 54.0 -5.432 3.867 39.572 -10.635 -9.740 73.0 0.804000 0.626000 0.946000 0.985000 0.606000 0.334000 0.948000 0.751000 0.0 1.076000 0.0 0.0 224.876 1.000 ... 4.946 5.892 -8.837 -6.837 -7.837 3.301 3.301 0.0 -2.663 1.337 -1.326 90.0 92.0 91.0 0.861000 0.861000 0.0 0.988000 0.988000 0.0 0.881000 0.881000 0.0 0.922000 0.922000 0.0 0.930000 0.930000 0.0 0.257000 0.257000 0.0 0.988000 0.994000 0.002973 0.831000 0.837000 0.002502 0.000000 0.000000 0.0 -3.650000 -3.650000 -0.000000 -510.0 510.0 0.0 -65025.0 65025.0 0.0
4 101.788 1.000 0.000 0.000 -6.074 -10.998 -0.036 24.359 0.0 0.001 1.0 1.0 0.0 0.0 0.000 255.0 255.0 200.0 247.0 145.0 0.0 111.0 218.0 175.0 194.0 182.0 11.0 255.0 254.0 190.0 2.879 1.634 -17.699 -0.874 -1.390 84.0 0.844000 0.994000 0.959000 0.992000 0.596000 0.313000 0.993000 0.813000 0.0 -1.825000 0.0 0.0 95.087 0.997 ... 3.336 2.672 -21.891 -19.891 -20.891 1.520 1.520 0.0 -1.708 2.292 0.584 77.0 79.0 78.0 0.853000 0.853000 0.0 0.748000 0.748000 0.0 0.938000 0.938000 0.0 0.985000 0.985000 0.0 0.604000 0.604000 0.0 0.322000 0.322000 0.0 0.987000 0.985000 -0.000986 0.776000 0.778000 0.000777 0.000000 0.000000 0.0 -3.672000 -3.672000 -0.000000 -510.0 510.0 0.0 -65025.0 65025.0 0.0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
99075 192.900 1.000 0.000 0.000 2.460 -0.190 -1.300 22.700 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 255.0 255.0 255.0 220.0 0.0 175.0 194.0 229.0 254.0 255.0 254.0 252.0 12.870 1.380 2.760 1.920 0.870 114.0 0.956066 0.982004 0.565472 0.851993 0.931530 0.875716 0.989203 0.879403 0.0 3.122139 0.0 0.0 120.890 1.000 ... 3.880 3.760 -5.210 -3.210 -4.210 -0.010 -0.010 -0.0 4.590 8.590 13.180 48.0 50.0 49.0 0.892458 0.892458 0.0 0.885539 0.885539 0.0 0.933667 0.933667 0.0 0.992177 0.992177 0.0 0.799719 0.799719 0.0 0.837613 0.837613 0.0 0.996382 0.996382 0.000000 0.905628 0.905628 0.000000 0.000000 0.000000 0.0 -100.407652 105.592348 267.011844 -407.0 407.0 0.0 -38760.0 38760.0 0.0
99076 118.393 0.995 0.005 0.000 20.230 -7.639 -3.885 54.905 1.0 0.048 1.0 1.0 0.0 0.0 0.022 255.0 255.0 225.0 240.0 255.0 0.0 118.0 131.0 175.0 179.0 210.0 0.0 255.0 254.0 238.0 -4.845 2.147 17.396 2.985 2.943 92.0 0.736000 0.955000 0.695000 0.982000 0.986000 0.503000 0.981000 0.835000 0.0 -3.166000 0.0 0.0 138.576 0.998 ... 5.595 7.190 -5.916 -3.916 -4.916 21.586 21.586 0.0 -4.760 -0.760 -5.520 85.0 87.0 86.0 0.522000 0.522000 0.0 0.169000 0.169000 0.0 0.532000 0.532000 0.0 0.973000 0.973000 0.0 0.059000 0.059000 0.0 0.822000 0.822000 0.0 0.971000 1.013000 0.020832 0.559000 0.605000 0.013386 -0.000022 0.000022 0.0 -1.466000 -1.466000 -0.000000 -510.0 510.0 0.0 -65025.0 65025.0 0.0
99077 118.303 1.000 0.000 0.000 3.957 -10.470 0.118 41.823 0.0 0.002 1.0 1.0 0.0 0.0 0.001 255.0 255.0 240.0 255.0 255.0 0.0 194.0 0.0 175.0 168.0 205.0 232.0 255.0 254.0 231.0 2.824 2.578 1.343 -3.707 1.650 94.0 0.861000 0.992000 0.654000 0.664000 0.996000 0.531000 0.992000 0.814000 0.0 1.420000 0.0 0.0 137.821 1.000 ... 3.960 3.920 -1.002 0.998 -0.002 2.647 2.647 0.0 -2.122 1.878 -0.244 87.0 89.0 88.0 0.607000 0.607000 0.0 0.992000 0.992000 0.0 0.706000 0.706000 0.0 0.894000 0.894000 0.0 0.822000 0.822000 0.0 0.543000 0.543000 0.0 0.989000 0.987000 -0.000988 0.791000 0.797000 0.002382 -0.000002 0.000002 0.0 1.098000 1.098000 0.000000 -510.0 510.0 0.0 -65025.0 65025.0 0.0
99078 201.492 1.000 0.000 0.000 4.239 8.468 1.957 19.072 0.0 0.002 1.0 1.0 0.0 0.0 0.001 255.0 255.0 233.0 254.0 192.0 0.0 110.0 255.0 175.0 194.0 169.0 0.0 255.0 254.0 178.0 4.703 1.667 9.097 12.556 0.687 92.0 0.698000 0.986000 0.832000 0.995000 0.803000 0.277000 0.997000 0.799000 0.0 -4.558000 0.0 0.0 125.461 0.749 ... 3.892 3.784 -31.236 -29.236 -30.236 4.658 4.658 0.0 -6.080 -2.080 -8.160 76.0 78.0 77.0 0.894000 0.894000 0.0 0.733000 0.733000 0.0 0.847000 0.847000 0.0 0.994000 0.994000 0.0 0.047000 0.047000 0.0 0.251000 0.251000 0.0 0.975000 0.975000 0.000000 0.676000 0.680000 0.001356 -0.000001 0.000001 0.0 1.193000 1.193000 0.000000 -510.0 510.0 0.0 -65025.0 65025.0 0.0
99079 167.230 1.000 0.000 0.000 -2.900 -2.170 -0.080 40.690 0.0 0.000 1.0 1.0 0.0 0.0 0.000 255.0 255.0 255.0 254.0 255.0 255.0 219.0 255.0 151.0 194.0 213.0 254.0 255.0 254.0 235.0 2.840 1.190 -3.160 1.820 -0.820 121.0 0.741879 0.991252 0.890749 0.965584 0.896677 0.265269 0.993529 0.821524 0.0 2.721282 0.0 0.0 164.140 1.000 ... 3.930 3.860 -1.700 0.300 -0.700 -4.830 -4.830 -0.0 -4.230 -0.230 -4.460 55.0 57.0 56.0 0.718814 0.718814 0.0 0.994363 0.994363 0.0 0.814303 0.814303 0.0 0.889618 0.889618 0.0 0.960352 0.960352 0.0 0.343227 0.343227 0.0 0.991761 0.991761 0.000000 0.816987 0.816987 0.000000 0.000000 0.000000 0.0 -138.918229 145.081771 437.611482 -368.0 368.0 0.0 -28815.0 28815.0 0.0

99080 rows × 384 columns

Voting with enlarged dataframe

Split whole dataset

In [11]:
enlarged_voting_gbc_X_train, enlarged_voting_gbc_X_valid, enlarged_voting_gbc_y_train, enlarged_voting_gbc_y_valid = train_test_split(X_dataframe_enlarged, y, test_size=0.2, random_state=57)

Define hyperparameters and fit the model

In [15]:
%%time

estimators = [
    ('xgbc', XGBClassifier(booster='gbtree', learning_rate=0.3, 
                     max_depth=6, n_estimators=103, 
                     colsample_bynode=1, colsample_bytree=1,
                     subsample=1, gamma=0, 
                     objective='binary:logistic', random_state=57)),
    
    ('lgbc', LGBMClassifier(objective= 'binary', 
                            n_estimators = 2000, random_state=57)),
    
    ('catgbc', CatBoostClassifier(eval_metric= 'Logloss', iterations= 1500, 
                                  learning_rate= 0.1, subsample= 0.8, random_state=57))
]

voting_clf_enlarged = VotingClassifier(estimators=estimators, voting='hard')

voting_clf_enlarged.fit(enlarged_voting_gbc_X_train, enlarged_voting_gbc_y_train)
[15:27:43] WARNING: C:/Users/Administrator/workspace/xgboost-win64_release_1.5.1/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
0:	learn: 0.6617314	total: 378ms	remaining: 9m 26s
1:	learn: 0.6362425	total: 772ms	remaining: 9m 37s
2:	learn: 0.6163850	total: 1.14s	remaining: 9m 29s
3:	learn: 0.5992620	total: 1.61s	remaining: 10m 3s
4:	learn: 0.5861312	total: 1.95s	remaining: 9m 43s
5:	learn: 0.5744459	total: 2.36s	remaining: 9m 47s
6:	learn: 0.5628752	total: 2.79s	remaining: 9m 56s
7:	learn: 0.5541860	total: 3.12s	remaining: 9m 40s
8:	learn: 0.5469270	total: 3.51s	remaining: 9m 40s
9:	learn: 0.5389828	total: 3.99s	remaining: 9m 54s
10:	learn: 0.5334995	total: 4.33s	remaining: 9m 45s
11:	learn: 0.5277125	total: 4.75s	remaining: 9m 49s
12:	learn: 0.5230033	total: 5.07s	remaining: 9m 39s
13:	learn: 0.5190566	total: 5.43s	remaining: 9m 36s
14:	learn: 0.5139382	total: 5.85s	remaining: 9m 39s
15:	learn: 0.5099748	total: 6.25s	remaining: 9m 40s
16:	learn: 0.5066705	total: 6.59s	remaining: 9m 34s
17:	learn: 0.5035907	total: 7.01s	remaining: 9m 37s
18:	learn: 0.5008514	total: 7.39s	remaining: 9m 35s
19:	learn: 0.4973629	total: 7.94s	remaining: 9m 47s
20:	learn: 0.4947953	total: 8.29s	remaining: 9m 43s
21:	learn: 0.4922816	total: 8.72s	remaining: 9m 45s
22:	learn: 0.4898955	total: 9.13s	remaining: 9m 46s
23:	learn: 0.4878150	total: 9.5s	remaining: 9m 44s
24:	learn: 0.4855598	total: 9.88s	remaining: 9m 43s
25:	learn: 0.4836790	total: 10.2s	remaining: 9m 39s
26:	learn: 0.4814123	total: 10.7s	remaining: 9m 42s
27:	learn: 0.4794607	total: 11.1s	remaining: 9m 40s
28:	learn: 0.4776959	total: 11.4s	remaining: 9m 35s
29:	learn: 0.4762536	total: 11.7s	remaining: 9m 33s
30:	learn: 0.4743515	total: 12.1s	remaining: 9m 34s
31:	learn: 0.4729333	total: 12.5s	remaining: 9m 32s
32:	learn: 0.4712289	total: 13s	remaining: 9m 37s
33:	learn: 0.4699287	total: 13.3s	remaining: 9m 33s
34:	learn: 0.4683515	total: 13.8s	remaining: 9m 35s
35:	learn: 0.4671968	total: 14.1s	remaining: 9m 32s
36:	learn: 0.4660086	total: 14.4s	remaining: 9m 28s
37:	learn: 0.4648738	total: 14.7s	remaining: 9m 25s
38:	learn: 0.4637597	total: 15.1s	remaining: 9m 25s
39:	learn: 0.4624308	total: 15.5s	remaining: 9m 25s
40:	learn: 0.4613676	total: 15.9s	remaining: 9m 25s
41:	learn: 0.4604727	total: 16.4s	remaining: 9m 27s
42:	learn: 0.4596207	total: 16.6s	remaining: 9m 23s
43:	learn: 0.4587858	total: 17s	remaining: 9m 21s
44:	learn: 0.4580506	total: 17.3s	remaining: 9m 19s
45:	learn: 0.4571579	total: 17.7s	remaining: 9m 19s
46:	learn: 0.4563833	total: 17.9s	remaining: 9m 14s
47:	learn: 0.4557336	total: 18.2s	remaining: 9m 11s
48:	learn: 0.4550757	total: 18.5s	remaining: 9m 9s
49:	learn: 0.4542985	total: 18.9s	remaining: 9m 8s
50:	learn: 0.4537192	total: 19.3s	remaining: 9m 7s
51:	learn: 0.4529708	total: 19.7s	remaining: 9m 8s
52:	learn: 0.4523110	total: 20.1s	remaining: 9m 7s
53:	learn: 0.4517893	total: 20.4s	remaining: 9m 6s
54:	learn: 0.4509870	total: 20.8s	remaining: 9m 7s
55:	learn: 0.4502360	total: 21.3s	remaining: 9m 9s
56:	learn: 0.4495959	total: 21.7s	remaining: 9m 8s
57:	learn: 0.4487629	total: 22.1s	remaining: 9m 8s
58:	learn: 0.4483233	total: 22.4s	remaining: 9m 6s
59:	learn: 0.4478492	total: 22.8s	remaining: 9m 6s
60:	learn: 0.4472807	total: 23.3s	remaining: 9m 10s
61:	learn: 0.4468000	total: 23.7s	remaining: 9m 10s
62:	learn: 0.4463235	total: 24.1s	remaining: 9m 8s
63:	learn: 0.4457082	total: 24.7s	remaining: 9m 14s
64:	learn: 0.4452480	total: 25.2s	remaining: 9m 17s
65:	learn: 0.4448033	total: 25.8s	remaining: 9m 21s
66:	learn: 0.4442737	total: 26.4s	remaining: 9m 25s
67:	learn: 0.4438673	total: 26.9s	remaining: 9m 26s
68:	learn: 0.4434484	total: 27.3s	remaining: 9m 26s
69:	learn: 0.4430063	total: 27.8s	remaining: 9m 28s
70:	learn: 0.4426705	total: 28.2s	remaining: 9m 27s
71:	learn: 0.4423014	total: 28.6s	remaining: 9m 26s
72:	learn: 0.4416339	total: 28.9s	remaining: 9m 25s
73:	learn: 0.4412239	total: 29.3s	remaining: 9m 25s
74:	learn: 0.4406743	total: 29.8s	remaining: 9m 25s
75:	learn: 0.4402141	total: 30.1s	remaining: 9m 23s
76:	learn: 0.4399204	total: 30.4s	remaining: 9m 22s
77:	learn: 0.4396434	total: 30.7s	remaining: 9m 19s
78:	learn: 0.4393051	total: 31s	remaining: 9m 17s
79:	learn: 0.4389638	total: 31.4s	remaining: 9m 17s
80:	learn: 0.4385119	total: 31.8s	remaining: 9m 17s
81:	learn: 0.4381456	total: 32.2s	remaining: 9m 16s
82:	learn: 0.4378254	total: 32.5s	remaining: 9m 14s
83:	learn: 0.4376173	total: 32.8s	remaining: 9m 12s
84:	learn: 0.4373675	total: 33.1s	remaining: 9m 10s
85:	learn: 0.4369760	total: 33.5s	remaining: 9m 10s
86:	learn: 0.4366671	total: 33.9s	remaining: 9m 10s
87:	learn: 0.4363562	total: 34.4s	remaining: 9m 12s
88:	learn: 0.4360799	total: 34.8s	remaining: 9m 11s
89:	learn: 0.4358504	total: 35.2s	remaining: 9m 11s
90:	learn: 0.4356320	total: 35.5s	remaining: 9m 10s
91:	learn: 0.4353532	total: 36s	remaining: 9m 10s
92:	learn: 0.4351282	total: 36.4s	remaining: 9m 10s
93:	learn: 0.4348915	total: 36.8s	remaining: 9m 10s
94:	learn: 0.4347049	total: 37.2s	remaining: 9m 9s
95:	learn: 0.4344634	total: 37.6s	remaining: 9m 10s
96:	learn: 0.4341321	total: 38.1s	remaining: 9m 10s
97:	learn: 0.4338610	total: 38.5s	remaining: 9m 10s
98:	learn: 0.4335780	total: 38.9s	remaining: 9m 10s
99:	learn: 0.4332109	total: 39.4s	remaining: 9m 10s
100:	learn: 0.4329642	total: 39.8s	remaining: 9m 11s
101:	learn: 0.4326294	total: 40.3s	remaining: 9m 12s
102:	learn: 0.4324315	total: 40.8s	remaining: 9m 13s
103:	learn: 0.4322047	total: 41.3s	remaining: 9m 13s
104:	learn: 0.4319126	total: 41.7s	remaining: 9m 13s
105:	learn: 0.4316796	total: 42s	remaining: 9m 12s
106:	learn: 0.4314852	total: 42.4s	remaining: 9m 12s
107:	learn: 0.4312105	total: 42.8s	remaining: 9m 12s
108:	learn: 0.4309746	total: 43.1s	remaining: 9m 10s
109:	learn: 0.4307194	total: 43.5s	remaining: 9m 9s
110:	learn: 0.4305001	total: 43.9s	remaining: 9m 9s
111:	learn: 0.4302972	total: 44.3s	remaining: 9m 8s
112:	learn: 0.4300873	total: 44.6s	remaining: 9m 7s
113:	learn: 0.4298912	total: 45s	remaining: 9m 6s
114:	learn: 0.4296964	total: 45.4s	remaining: 9m 6s
115:	learn: 0.4294207	total: 45.8s	remaining: 9m 6s
116:	learn: 0.4292371	total: 46.1s	remaining: 9m 5s
117:	learn: 0.4290317	total: 46.6s	remaining: 9m 5s
118:	learn: 0.4286612	total: 47s	remaining: 9m 5s
119:	learn: 0.4284146	total: 47.5s	remaining: 9m 5s
120:	learn: 0.4282692	total: 47.8s	remaining: 9m 5s
121:	learn: 0.4281188	total: 48.2s	remaining: 9m 4s
122:	learn: 0.4279645	total: 48.5s	remaining: 9m 3s
123:	learn: 0.4277801	total: 48.9s	remaining: 9m 2s
124:	learn: 0.4275894	total: 49.3s	remaining: 9m 1s
125:	learn: 0.4274008	total: 49.6s	remaining: 9m 1s
126:	learn: 0.4272393	total: 50s	remaining: 9m
127:	learn: 0.4270103	total: 50.4s	remaining: 8m 59s
128:	learn: 0.4267832	total: 50.8s	remaining: 8m 59s
129:	learn: 0.4266090	total: 51.2s	remaining: 8m 59s
130:	learn: 0.4264045	total: 51.5s	remaining: 8m 58s
131:	learn: 0.4261981	total: 51.8s	remaining: 8m 57s
132:	learn: 0.4259818	total: 52.2s	remaining: 8m 56s
133:	learn: 0.4258297	total: 52.6s	remaining: 8m 56s
134:	learn: 0.4255988	total: 53s	remaining: 8m 55s
135:	learn: 0.4253949	total: 53.4s	remaining: 8m 55s
136:	learn: 0.4251806	total: 53.9s	remaining: 8m 56s
137:	learn: 0.4249959	total: 54.4s	remaining: 8m 56s
138:	learn: 0.4248304	total: 54.8s	remaining: 8m 56s
139:	learn: 0.4246144	total: 55.3s	remaining: 8m 56s
140:	learn: 0.4244403	total: 55.7s	remaining: 8m 57s
141:	learn: 0.4242466	total: 56.2s	remaining: 8m 57s
142:	learn: 0.4241013	total: 56.6s	remaining: 8m 56s
143:	learn: 0.4239673	total: 57s	remaining: 8m 56s
144:	learn: 0.4237115	total: 57.4s	remaining: 8m 56s
145:	learn: 0.4235180	total: 57.8s	remaining: 8m 55s
146:	learn: 0.4233203	total: 58.2s	remaining: 8m 55s
147:	learn: 0.4231332	total: 58.6s	remaining: 8m 55s
148:	learn: 0.4229476	total: 59s	remaining: 8m 54s
149:	learn: 0.4227403	total: 59.4s	remaining: 8m 54s
150:	learn: 0.4225534	total: 59.9s	remaining: 8m 54s
151:	learn: 0.4223657	total: 1m	remaining: 8m 54s
152:	learn: 0.4221675	total: 1m	remaining: 8m 54s
153:	learn: 0.4219942	total: 1m 1s	remaining: 8m 53s
154:	learn: 0.4218227	total: 1m 1s	remaining: 8m 52s
155:	learn: 0.4215723	total: 1m 1s	remaining: 8m 52s
156:	learn: 0.4214042	total: 1m 2s	remaining: 8m 51s
157:	learn: 0.4212629	total: 1m 2s	remaining: 8m 50s
158:	learn: 0.4211334	total: 1m 2s	remaining: 8m 49s
159:	learn: 0.4209981	total: 1m 3s	remaining: 8m 48s
160:	learn: 0.4208118	total: 1m 3s	remaining: 8m 48s
161:	learn: 0.4206860	total: 1m 3s	remaining: 8m 47s
162:	learn: 0.4205417	total: 1m 4s	remaining: 8m 46s
163:	learn: 0.4203711	total: 1m 4s	remaining: 8m 45s
164:	learn: 0.4202379	total: 1m 4s	remaining: 8m 45s
165:	learn: 0.4201213	total: 1m 5s	remaining: 8m 45s
166:	learn: 0.4199984	total: 1m 5s	remaining: 8m 43s
167:	learn: 0.4197993	total: 1m 5s	remaining: 8m 43s
168:	learn: 0.4196187	total: 1m 6s	remaining: 8m 42s
169:	learn: 0.4194897	total: 1m 6s	remaining: 8m 41s
170:	learn: 0.4192968	total: 1m 6s	remaining: 8m 40s
171:	learn: 0.4191932	total: 1m 7s	remaining: 8m 39s
172:	learn: 0.4190393	total: 1m 7s	remaining: 8m 39s
173:	learn: 0.4188486	total: 1m 8s	remaining: 8m 38s
174:	learn: 0.4187131	total: 1m 8s	remaining: 8m 37s
175:	learn: 0.4185781	total: 1m 8s	remaining: 8m 36s
176:	learn: 0.4184569	total: 1m 9s	remaining: 8m 36s
177:	learn: 0.4182970	total: 1m 9s	remaining: 8m 35s
178:	learn: 0.4181846	total: 1m 9s	remaining: 8m 34s
179:	learn: 0.4180660	total: 1m 9s	remaining: 8m 33s
180:	learn: 0.4179241	total: 1m 10s	remaining: 8m 33s
181:	learn: 0.4177989	total: 1m 10s	remaining: 8m 32s
182:	learn: 0.4176626	total: 1m 11s	remaining: 8m 32s
183:	learn: 0.4175324	total: 1m 11s	remaining: 8m 33s
184:	learn: 0.4173594	total: 1m 12s	remaining: 8m 32s
185:	learn: 0.4171998	total: 1m 12s	remaining: 8m 32s
186:	learn: 0.4170521	total: 1m 13s	remaining: 8m 32s
187:	learn: 0.4169258	total: 1m 13s	remaining: 8m 32s
188:	learn: 0.4168282	total: 1m 13s	remaining: 8m 32s
189:	learn: 0.4167106	total: 1m 14s	remaining: 8m 32s
190:	learn: 0.4165556	total: 1m 14s	remaining: 8m 32s
191:	learn: 0.4164326	total: 1m 15s	remaining: 8m 31s
192:	learn: 0.4162925	total: 1m 15s	remaining: 8m 31s
193:	learn: 0.4161692	total: 1m 15s	remaining: 8m 30s
194:	learn: 0.4160569	total: 1m 16s	remaining: 8m 30s
195:	learn: 0.4158500	total: 1m 16s	remaining: 8m 30s
196:	learn: 0.4157648	total: 1m 17s	remaining: 8m 29s
197:	learn: 0.4156695	total: 1m 17s	remaining: 8m 28s
198:	learn: 0.4155702	total: 1m 17s	remaining: 8m 27s
199:	learn: 0.4154366	total: 1m 18s	remaining: 8m 27s
200:	learn: 0.4153555	total: 1m 18s	remaining: 8m 26s
201:	learn: 0.4152479	total: 1m 18s	remaining: 8m 25s
202:	learn: 0.4151599	total: 1m 19s	remaining: 8m 25s
203:	learn: 0.4150225	total: 1m 19s	remaining: 8m 24s
204:	learn: 0.4149018	total: 1m 19s	remaining: 8m 23s
205:	learn: 0.4147695	total: 1m 20s	remaining: 8m 23s
206:	learn: 0.4146552	total: 1m 20s	remaining: 8m 23s
207:	learn: 0.4145289	total: 1m 21s	remaining: 8m 23s
208:	learn: 0.4144353	total: 1m 21s	remaining: 8m 22s
209:	learn: 0.4143259	total: 1m 21s	remaining: 8m 21s
210:	learn: 0.4142157	total: 1m 22s	remaining: 8m 21s
211:	learn: 0.4141080	total: 1m 22s	remaining: 8m 20s
212:	learn: 0.4139982	total: 1m 22s	remaining: 8m 20s
213:	learn: 0.4138519	total: 1m 23s	remaining: 8m 20s
214:	learn: 0.4137380	total: 1m 23s	remaining: 8m 20s
215:	learn: 0.4136149	total: 1m 24s	remaining: 8m 19s
216:	learn: 0.4135141	total: 1m 24s	remaining: 8m 19s
217:	learn: 0.4133815	total: 1m 24s	remaining: 8m 19s
218:	learn: 0.4132523	total: 1m 25s	remaining: 8m 19s
219:	learn: 0.4131438	total: 1m 25s	remaining: 8m 19s
220:	learn: 0.4130059	total: 1m 26s	remaining: 8m 19s
221:	learn: 0.4129206	total: 1m 26s	remaining: 8m 18s
222:	learn: 0.4128140	total: 1m 27s	remaining: 8m 18s
223:	learn: 0.4126981	total: 1m 27s	remaining: 8m 18s
224:	learn: 0.4125928	total: 1m 27s	remaining: 8m 17s
225:	learn: 0.4124945	total: 1m 28s	remaining: 8m 17s
226:	learn: 0.4123882	total: 1m 28s	remaining: 8m 16s
227:	learn: 0.4123025	total: 1m 28s	remaining: 8m 16s
228:	learn: 0.4122024	total: 1m 29s	remaining: 8m 15s
229:	learn: 0.4120779	total: 1m 29s	remaining: 8m 15s
230:	learn: 0.4119543	total: 1m 30s	remaining: 8m 14s
231:	learn: 0.4118362	total: 1m 30s	remaining: 8m 14s
232:	learn: 0.4117415	total: 1m 30s	remaining: 8m 14s
233:	learn: 0.4116489	total: 1m 31s	remaining: 8m 13s
234:	learn: 0.4115494	total: 1m 31s	remaining: 8m 13s
235:	learn: 0.4114702	total: 1m 32s	remaining: 8m 12s
236:	learn: 0.4113496	total: 1m 32s	remaining: 8m 12s
237:	learn: 0.4112443	total: 1m 32s	remaining: 8m 11s
238:	learn: 0.4111562	total: 1m 33s	remaining: 8m 11s
239:	learn: 0.4110467	total: 1m 33s	remaining: 8m 10s
240:	learn: 0.4109414	total: 1m 33s	remaining: 8m 9s
241:	learn: 0.4108426	total: 1m 34s	remaining: 8m 9s
242:	learn: 0.4107555	total: 1m 34s	remaining: 8m 9s
243:	learn: 0.4106527	total: 1m 34s	remaining: 8m 8s
244:	learn: 0.4105681	total: 1m 35s	remaining: 8m 8s
245:	learn: 0.4104581	total: 1m 35s	remaining: 8m 7s
246:	learn: 0.4103239	total: 1m 36s	remaining: 8m 7s
247:	learn: 0.4102188	total: 1m 36s	remaining: 8m 7s
248:	learn: 0.4101191	total: 1m 36s	remaining: 8m 6s
249:	learn: 0.4100317	total: 1m 37s	remaining: 8m 5s
250:	learn: 0.4099478	total: 1m 37s	remaining: 8m 5s
251:	learn: 0.4098635	total: 1m 37s	remaining: 8m 4s
252:	learn: 0.4097717	total: 1m 38s	remaining: 8m 4s
253:	learn: 0.4096244	total: 1m 38s	remaining: 8m 4s
254:	learn: 0.4095367	total: 1m 38s	remaining: 8m 3s
255:	learn: 0.4094383	total: 1m 39s	remaining: 8m 2s
256:	learn: 0.4093247	total: 1m 39s	remaining: 8m 2s
257:	learn: 0.4092416	total: 1m 40s	remaining: 8m 1s
258:	learn: 0.4091629	total: 1m 40s	remaining: 8m 1s
259:	learn: 0.4090664	total: 1m 40s	remaining: 8m
260:	learn: 0.4089424	total: 1m 41s	remaining: 8m
261:	learn: 0.4088588	total: 1m 41s	remaining: 7m 59s
262:	learn: 0.4087900	total: 1m 41s	remaining: 7m 58s
263:	learn: 0.4086928	total: 1m 42s	remaining: 7m 58s
264:	learn: 0.4086109	total: 1m 42s	remaining: 7m 57s
265:	learn: 0.4085190	total: 1m 42s	remaining: 7m 57s
266:	learn: 0.4084522	total: 1m 43s	remaining: 7m 56s
267:	learn: 0.4083614	total: 1m 43s	remaining: 7m 56s
268:	learn: 0.4082689	total: 1m 44s	remaining: 7m 56s
269:	learn: 0.4082011	total: 1m 44s	remaining: 7m 55s
270:	learn: 0.4081027	total: 1m 44s	remaining: 7m 55s
271:	learn: 0.4080208	total: 1m 45s	remaining: 7m 55s
272:	learn: 0.4079247	total: 1m 45s	remaining: 7m 54s
273:	learn: 0.4078346	total: 1m 45s	remaining: 7m 53s
274:	learn: 0.4077586	total: 1m 46s	remaining: 7m 53s
275:	learn: 0.4076534	total: 1m 46s	remaining: 7m 52s
276:	learn: 0.4075852	total: 1m 46s	remaining: 7m 51s
277:	learn: 0.4075206	total: 1m 47s	remaining: 7m 51s
278:	learn: 0.4074389	total: 1m 47s	remaining: 7m 50s
279:	learn: 0.4073451	total: 1m 47s	remaining: 7m 50s
280:	learn: 0.4072478	total: 1m 48s	remaining: 7m 50s
281:	learn: 0.4071658	total: 1m 48s	remaining: 7m 49s
282:	learn: 0.4070822	total: 1m 49s	remaining: 7m 48s
283:	learn: 0.4069850	total: 1m 49s	remaining: 7m 48s
284:	learn: 0.4069013	total: 1m 49s	remaining: 7m 47s
285:	learn: 0.4068177	total: 1m 50s	remaining: 7m 47s
286:	learn: 0.4067293	total: 1m 50s	remaining: 7m 46s
287:	learn: 0.4066574	total: 1m 50s	remaining: 7m 46s
288:	learn: 0.4065722	total: 1m 51s	remaining: 7m 45s
289:	learn: 0.4064752	total: 1m 51s	remaining: 7m 44s
290:	learn: 0.4063874	total: 1m 51s	remaining: 7m 44s
291:	learn: 0.4062844	total: 1m 52s	remaining: 7m 43s
292:	learn: 0.4062172	total: 1m 52s	remaining: 7m 43s
293:	learn: 0.4061343	total: 1m 52s	remaining: 7m 42s
294:	learn: 0.4060144	total: 1m 53s	remaining: 7m 42s
295:	learn: 0.4059368	total: 1m 53s	remaining: 7m 42s
296:	learn: 0.4058570	total: 1m 53s	remaining: 7m 41s
297:	learn: 0.4057960	total: 1m 54s	remaining: 7m 40s
298:	learn: 0.4056975	total: 1m 54s	remaining: 7m 40s
299:	learn: 0.4055972	total: 1m 55s	remaining: 7m 40s
300:	learn: 0.4055120	total: 1m 55s	remaining: 7m 39s
301:	learn: 0.4054299	total: 1m 55s	remaining: 7m 39s
302:	learn: 0.4053490	total: 1m 56s	remaining: 7m 39s
303:	learn: 0.4052543	total: 1m 56s	remaining: 7m 38s
304:	learn: 0.4051725	total: 1m 56s	remaining: 7m 38s
305:	learn: 0.4050946	total: 1m 57s	remaining: 7m 37s
306:	learn: 0.4050167	total: 1m 57s	remaining: 7m 37s
307:	learn: 0.4049465	total: 1m 58s	remaining: 7m 36s
308:	learn: 0.4048753	total: 1m 58s	remaining: 7m 36s
309:	learn: 0.4047903	total: 1m 58s	remaining: 7m 35s
310:	learn: 0.4046949	total: 1m 59s	remaining: 7m 35s
311:	learn: 0.4045981	total: 1m 59s	remaining: 7m 35s
312:	learn: 0.4045099	total: 1m 59s	remaining: 7m 35s
313:	learn: 0.4044367	total: 2m	remaining: 7m 34s
314:	learn: 0.4043535	total: 2m	remaining: 7m 34s
315:	learn: 0.4042604	total: 2m 1s	remaining: 7m 33s
316:	learn: 0.4041536	total: 2m 1s	remaining: 7m 33s
317:	learn: 0.4040494	total: 2m 2s	remaining: 7m 33s
318:	learn: 0.4039592	total: 2m 2s	remaining: 7m 33s
319:	learn: 0.4038831	total: 2m 2s	remaining: 7m 32s
320:	learn: 0.4037940	total: 2m 3s	remaining: 7m 32s
321:	learn: 0.4037474	total: 2m 3s	remaining: 7m 31s
322:	learn: 0.4036804	total: 2m 3s	remaining: 7m 30s
323:	learn: 0.4036018	total: 2m 4s	remaining: 7m 30s
324:	learn: 0.4035057	total: 2m 4s	remaining: 7m 29s
325:	learn: 0.4033937	total: 2m 4s	remaining: 7m 29s
326:	learn: 0.4032909	total: 2m 5s	remaining: 7m 28s
327:	learn: 0.4032217	total: 2m 5s	remaining: 7m 28s
328:	learn: 0.4031525	total: 2m 5s	remaining: 7m 27s
329:	learn: 0.4030688	total: 2m 6s	remaining: 7m 27s
330:	learn: 0.4030085	total: 2m 6s	remaining: 7m 26s
331:	learn: 0.4029527	total: 2m 6s	remaining: 7m 25s
332:	learn: 0.4029037	total: 2m 7s	remaining: 7m 25s
333:	learn: 0.4028231	total: 2m 7s	remaining: 7m 24s
334:	learn: 0.4027539	total: 2m 7s	remaining: 7m 24s
335:	learn: 0.4026572	total: 2m 8s	remaining: 7m 23s
336:	learn: 0.4025768	total: 2m 8s	remaining: 7m 23s
337:	learn: 0.4024819	total: 2m 8s	remaining: 7m 23s
338:	learn: 0.4024026	total: 2m 9s	remaining: 7m 23s
339:	learn: 0.4023284	total: 2m 9s	remaining: 7m 22s
340:	learn: 0.4022481	total: 2m 10s	remaining: 7m 22s
341:	learn: 0.4021660	total: 2m 10s	remaining: 7m 21s
342:	learn: 0.4020838	total: 2m 10s	remaining: 7m 21s
343:	learn: 0.4019836	total: 2m 11s	remaining: 7m 20s
344:	learn: 0.4019011	total: 2m 11s	remaining: 7m 20s
345:	learn: 0.4018331	total: 2m 11s	remaining: 7m 19s
346:	learn: 0.4017596	total: 2m 12s	remaining: 7m 18s
347:	learn: 0.4016722	total: 2m 12s	remaining: 7m 18s
348:	learn: 0.4016001	total: 2m 12s	remaining: 7m 17s
349:	learn: 0.4015340	total: 2m 13s	remaining: 7m 17s
350:	learn: 0.4014622	total: 2m 13s	remaining: 7m 16s
351:	learn: 0.4014033	total: 2m 13s	remaining: 7m 16s
352:	learn: 0.4013019	total: 2m 14s	remaining: 7m 16s
353:	learn: 0.4012301	total: 2m 14s	remaining: 7m 15s
354:	learn: 0.4011676	total: 2m 14s	remaining: 7m 15s
355:	learn: 0.4011118	total: 2m 15s	remaining: 7m 14s
356:	learn: 0.4010374	total: 2m 15s	remaining: 7m 14s
357:	learn: 0.4009729	total: 2m 16s	remaining: 7m 14s
358:	learn: 0.4009098	total: 2m 16s	remaining: 7m 13s
359:	learn: 0.4008418	total: 2m 16s	remaining: 7m 13s
360:	learn: 0.4007535	total: 2m 17s	remaining: 7m 12s
361:	learn: 0.4006851	total: 2m 17s	remaining: 7m 12s
362:	learn: 0.4006226	total: 2m 17s	remaining: 7m 11s
363:	learn: 0.4005401	total: 2m 18s	remaining: 7m 11s
364:	learn: 0.4004777	total: 2m 18s	remaining: 7m 10s
365:	learn: 0.4004070	total: 2m 18s	remaining: 7m 10s
366:	learn: 0.4003488	total: 2m 19s	remaining: 7m 9s
367:	learn: 0.4002848	total: 2m 19s	remaining: 7m 9s
368:	learn: 0.4001850	total: 2m 19s	remaining: 7m 8s
369:	learn: 0.4000968	total: 2m 20s	remaining: 7m 8s
370:	learn: 0.4000173	total: 2m 20s	remaining: 7m 8s
371:	learn: 0.3999598	total: 2m 20s	remaining: 7m 7s
372:	learn: 0.3999013	total: 2m 21s	remaining: 7m 6s
373:	learn: 0.3998347	total: 2m 21s	remaining: 7m 6s
374:	learn: 0.3997613	total: 2m 21s	remaining: 7m 5s
375:	learn: 0.3996785	total: 2m 22s	remaining: 7m 5s
376:	learn: 0.3995980	total: 2m 22s	remaining: 7m 4s
377:	learn: 0.3995402	total: 2m 23s	remaining: 7m 4s
378:	learn: 0.3994713	total: 2m 23s	remaining: 7m 4s
379:	learn: 0.3993930	total: 2m 23s	remaining: 7m 3s
380:	learn: 0.3993110	total: 2m 24s	remaining: 7m 3s
381:	learn: 0.3992357	total: 2m 24s	remaining: 7m 2s
382:	learn: 0.3991553	total: 2m 24s	remaining: 7m 2s
383:	learn: 0.3990859	total: 2m 25s	remaining: 7m 2s
384:	learn: 0.3990175	total: 2m 25s	remaining: 7m 1s
385:	learn: 0.3989610	total: 2m 25s	remaining: 7m
386:	learn: 0.3988765	total: 2m 26s	remaining: 7m
387:	learn: 0.3988224	total: 2m 26s	remaining: 6m 59s
388:	learn: 0.3987468	total: 2m 26s	remaining: 6m 59s
389:	learn: 0.3987009	total: 2m 27s	remaining: 6m 59s
390:	learn: 0.3986210	total: 2m 27s	remaining: 6m 58s
391:	learn: 0.3985508	total: 2m 28s	remaining: 6m 58s
392:	learn: 0.3984856	total: 2m 28s	remaining: 6m 57s
393:	learn: 0.3984380	total: 2m 28s	remaining: 6m 57s
394:	learn: 0.3983717	total: 2m 29s	remaining: 6m 56s
395:	learn: 0.3982954	total: 2m 29s	remaining: 6m 56s
396:	learn: 0.3982139	total: 2m 29s	remaining: 6m 56s
397:	learn: 0.3981478	total: 2m 30s	remaining: 6m 55s
398:	learn: 0.3980688	total: 2m 30s	remaining: 6m 55s
399:	learn: 0.3979999	total: 2m 30s	remaining: 6m 55s
400:	learn: 0.3979254	total: 2m 31s	remaining: 6m 54s
401:	learn: 0.3978793	total: 2m 31s	remaining: 6m 54s
402:	learn: 0.3977946	total: 2m 32s	remaining: 6m 54s
403:	learn: 0.3977325	total: 2m 32s	remaining: 6m 53s
404:	learn: 0.3976588	total: 2m 32s	remaining: 6m 53s
405:	learn: 0.3976072	total: 2m 33s	remaining: 6m 52s
406:	learn: 0.3975395	total: 2m 33s	remaining: 6m 52s
407:	learn: 0.3974676	total: 2m 33s	remaining: 6m 51s
408:	learn: 0.3974049	total: 2m 34s	remaining: 6m 51s
409:	learn: 0.3973342	total: 2m 34s	remaining: 6m 50s
410:	learn: 0.3972413	total: 2m 34s	remaining: 6m 50s
411:	learn: 0.3971597	total: 2m 35s	remaining: 6m 50s
412:	learn: 0.3970828	total: 2m 35s	remaining: 6m 49s
413:	learn: 0.3970118	total: 2m 35s	remaining: 6m 49s
414:	learn: 0.3969485	total: 2m 36s	remaining: 6m 48s
415:	learn: 0.3968713	total: 2m 36s	remaining: 6m 48s
416:	learn: 0.3967872	total: 2m 37s	remaining: 6m 47s
417:	learn: 0.3967263	total: 2m 37s	remaining: 6m 47s
418:	learn: 0.3966648	total: 2m 37s	remaining: 6m 46s
419:	learn: 0.3965925	total: 2m 38s	remaining: 6m 46s
420:	learn: 0.3965136	total: 2m 38s	remaining: 6m 46s
421:	learn: 0.3964548	total: 2m 38s	remaining: 6m 45s
422:	learn: 0.3963952	total: 2m 39s	remaining: 6m 45s
423:	learn: 0.3963209	total: 2m 39s	remaining: 6m 44s
424:	learn: 0.3962659	total: 2m 39s	remaining: 6m 44s
425:	learn: 0.3962115	total: 2m 40s	remaining: 6m 43s
426:	learn: 0.3961438	total: 2m 40s	remaining: 6m 43s
427:	learn: 0.3960902	total: 2m 40s	remaining: 6m 42s
428:	learn: 0.3960226	total: 2m 41s	remaining: 6m 42s
429:	learn: 0.3959697	total: 2m 41s	remaining: 6m 41s
430:	learn: 0.3958874	total: 2m 41s	remaining: 6m 41s
431:	learn: 0.3958097	total: 2m 42s	remaining: 6m 41s
432:	learn: 0.3957561	total: 2m 42s	remaining: 6m 40s
433:	learn: 0.3957029	total: 2m 42s	remaining: 6m 40s
434:	learn: 0.3956436	total: 2m 43s	remaining: 6m 39s
435:	learn: 0.3955768	total: 2m 43s	remaining: 6m 39s
436:	learn: 0.3955049	total: 2m 43s	remaining: 6m 38s
437:	learn: 0.3954352	total: 2m 44s	remaining: 6m 38s
438:	learn: 0.3953539	total: 2m 44s	remaining: 6m 37s
439:	learn: 0.3952777	total: 2m 44s	remaining: 6m 37s
440:	learn: 0.3952028	total: 2m 45s	remaining: 6m 37s
441:	learn: 0.3951242	total: 2m 45s	remaining: 6m 36s
442:	learn: 0.3950612	total: 2m 46s	remaining: 6m 36s
443:	learn: 0.3950077	total: 2m 46s	remaining: 6m 36s
444:	learn: 0.3949396	total: 2m 46s	remaining: 6m 35s
445:	learn: 0.3948861	total: 2m 47s	remaining: 6m 35s
446:	learn: 0.3948162	total: 2m 47s	remaining: 6m 34s
447:	learn: 0.3947489	total: 2m 48s	remaining: 6m 34s
448:	learn: 0.3946829	total: 2m 48s	remaining: 6m 34s
449:	learn: 0.3946194	total: 2m 48s	remaining: 6m 33s
450:	learn: 0.3945876	total: 2m 48s	remaining: 6m 33s
451:	learn: 0.3945072	total: 2m 49s	remaining: 6m 32s
452:	learn: 0.3944370	total: 2m 49s	remaining: 6m 32s
453:	learn: 0.3943552	total: 2m 50s	remaining: 6m 32s
454:	learn: 0.3942895	total: 2m 50s	remaining: 6m 31s
455:	learn: 0.3942215	total: 2m 50s	remaining: 6m 31s
456:	learn: 0.3941529	total: 2m 51s	remaining: 6m 30s
457:	learn: 0.3940992	total: 2m 51s	remaining: 6m 30s
458:	learn: 0.3940305	total: 2m 51s	remaining: 6m 30s
459:	learn: 0.3939758	total: 2m 52s	remaining: 6m 29s
460:	learn: 0.3939195	total: 2m 52s	remaining: 6m 29s
461:	learn: 0.3938541	total: 2m 53s	remaining: 6m 28s
462:	learn: 0.3937989	total: 2m 53s	remaining: 6m 28s
463:	learn: 0.3937171	total: 2m 53s	remaining: 6m 27s
464:	learn: 0.3936925	total: 2m 53s	remaining: 6m 27s
465:	learn: 0.3936311	total: 2m 54s	remaining: 6m 26s
466:	learn: 0.3935761	total: 2m 54s	remaining: 6m 26s
467:	learn: 0.3935134	total: 2m 54s	remaining: 6m 25s
468:	learn: 0.3934297	total: 2m 55s	remaining: 6m 25s
469:	learn: 0.3933559	total: 2m 55s	remaining: 6m 25s
470:	learn: 0.3932857	total: 2m 56s	remaining: 6m 24s
471:	learn: 0.3932372	total: 2m 56s	remaining: 6m 24s
472:	learn: 0.3931632	total: 2m 56s	remaining: 6m 23s
473:	learn: 0.3931151	total: 2m 57s	remaining: 6m 23s
474:	learn: 0.3930412	total: 2m 57s	remaining: 6m 22s
475:	learn: 0.3929982	total: 2m 57s	remaining: 6m 22s
476:	learn: 0.3929393	total: 2m 58s	remaining: 6m 21s
477:	learn: 0.3928858	total: 2m 58s	remaining: 6m 21s
478:	learn: 0.3928097	total: 2m 58s	remaining: 6m 21s
479:	learn: 0.3927678	total: 2m 59s	remaining: 6m 20s
480:	learn: 0.3927002	total: 2m 59s	remaining: 6m 20s
481:	learn: 0.3926339	total: 2m 59s	remaining: 6m 20s
482:	learn: 0.3925783	total: 3m	remaining: 6m 19s
483:	learn: 0.3925047	total: 3m	remaining: 6m 19s
484:	learn: 0.3924387	total: 3m	remaining: 6m 18s
485:	learn: 0.3923861	total: 3m 1s	remaining: 6m 18s
486:	learn: 0.3923093	total: 3m 1s	remaining: 6m 18s
487:	learn: 0.3922589	total: 3m 2s	remaining: 6m 17s
488:	learn: 0.3922027	total: 3m 2s	remaining: 6m 17s
489:	learn: 0.3921374	total: 3m 2s	remaining: 6m 17s
490:	learn: 0.3920651	total: 3m 3s	remaining: 6m 16s
491:	learn: 0.3919985	total: 3m 3s	remaining: 6m 16s
492:	learn: 0.3919501	total: 3m 4s	remaining: 6m 16s
493:	learn: 0.3918960	total: 3m 4s	remaining: 6m 15s
494:	learn: 0.3918480	total: 3m 4s	remaining: 6m 15s
495:	learn: 0.3917830	total: 3m 5s	remaining: 6m 14s
496:	learn: 0.3917321	total: 3m 5s	remaining: 6m 14s
497:	learn: 0.3916734	total: 3m 5s	remaining: 6m 13s
498:	learn: 0.3916173	total: 3m 6s	remaining: 6m 13s
499:	learn: 0.3915379	total: 3m 6s	remaining: 6m 12s
500:	learn: 0.3914734	total: 3m 6s	remaining: 6m 12s
501:	learn: 0.3914022	total: 3m 7s	remaining: 6m 12s
502:	learn: 0.3913355	total: 3m 7s	remaining: 6m 11s
503:	learn: 0.3912794	total: 3m 7s	remaining: 6m 11s
504:	learn: 0.3912003	total: 3m 8s	remaining: 6m 11s
505:	learn: 0.3911289	total: 3m 8s	remaining: 6m 10s
506:	learn: 0.3910833	total: 3m 9s	remaining: 6m 10s
507:	learn: 0.3910083	total: 3m 9s	remaining: 6m 9s
508:	learn: 0.3909588	total: 3m 9s	remaining: 6m 9s
509:	learn: 0.3909037	total: 3m 10s	remaining: 6m 8s
510:	learn: 0.3908952	total: 3m 10s	remaining: 6m 8s
511:	learn: 0.3908418	total: 3m 10s	remaining: 6m 7s
512:	learn: 0.3907793	total: 3m 10s	remaining: 6m 7s
513:	learn: 0.3907099	total: 3m 11s	remaining: 6m 7s
514:	learn: 0.3906482	total: 3m 11s	remaining: 6m 6s
515:	learn: 0.3905812	total: 3m 12s	remaining: 6m 6s
516:	learn: 0.3905295	total: 3m 12s	remaining: 6m 6s
517:	learn: 0.3904777	total: 3m 13s	remaining: 6m 5s
518:	learn: 0.3904016	total: 3m 13s	remaining: 6m 5s
519:	learn: 0.3903473	total: 3m 13s	remaining: 6m 5s
520:	learn: 0.3902961	total: 3m 14s	remaining: 6m 5s
521:	learn: 0.3902404	total: 3m 14s	remaining: 6m 4s
522:	learn: 0.3901798	total: 3m 14s	remaining: 6m 4s
523:	learn: 0.3901084	total: 3m 15s	remaining: 6m 3s
524:	learn: 0.3900533	total: 3m 15s	remaining: 6m 3s
525:	learn: 0.3900065	total: 3m 15s	remaining: 6m 2s
526:	learn: 0.3899310	total: 3m 16s	remaining: 6m 2s
527:	learn: 0.3898525	total: 3m 16s	remaining: 6m 2s
528:	learn: 0.3897997	total: 3m 17s	remaining: 6m 1s
529:	learn: 0.3897602	total: 3m 17s	remaining: 6m 1s
530:	learn: 0.3896901	total: 3m 17s	remaining: 6m
531:	learn: 0.3896465	total: 3m 18s	remaining: 6m
532:	learn: 0.3895766	total: 3m 18s	remaining: 5m 59s
533:	learn: 0.3895200	total: 3m 18s	remaining: 5m 59s
534:	learn: 0.3894638	total: 3m 19s	remaining: 5m 59s
535:	learn: 0.3893906	total: 3m 19s	remaining: 5m 58s
536:	learn: 0.3893319	total: 3m 19s	remaining: 5m 58s
537:	learn: 0.3893278	total: 3m 20s	remaining: 5m 57s
538:	learn: 0.3892536	total: 3m 20s	remaining: 5m 57s
539:	learn: 0.3891814	total: 3m 20s	remaining: 5m 57s
540:	learn: 0.3891385	total: 3m 21s	remaining: 5m 56s
541:	learn: 0.3890819	total: 3m 21s	remaining: 5m 56s
542:	learn: 0.3890190	total: 3m 22s	remaining: 5m 56s
543:	learn: 0.3889467	total: 3m 22s	remaining: 5m 55s
544:	learn: 0.3888799	total: 3m 22s	remaining: 5m 55s
545:	learn: 0.3888350	total: 3m 23s	remaining: 5m 54s
546:	learn: 0.3887770	total: 3m 23s	remaining: 5m 54s
547:	learn: 0.3887295	total: 3m 23s	remaining: 5m 54s
548:	learn: 0.3886797	total: 3m 24s	remaining: 5m 53s
549:	learn: 0.3886386	total: 3m 24s	remaining: 5m 53s
550:	learn: 0.3885887	total: 3m 24s	remaining: 5m 52s
551:	learn: 0.3885290	total: 3m 25s	remaining: 5m 52s
552:	learn: 0.3884916	total: 3m 25s	remaining: 5m 51s
553:	learn: 0.3884284	total: 3m 25s	remaining: 5m 51s
554:	learn: 0.3883927	total: 3m 25s	remaining: 5m 50s
555:	learn: 0.3883371	total: 3m 26s	remaining: 5m 50s
556:	learn: 0.3882802	total: 3m 26s	remaining: 5m 49s
557:	learn: 0.3882245	total: 3m 27s	remaining: 5m 49s
558:	learn: 0.3881733	total: 3m 27s	remaining: 5m 49s
559:	learn: 0.3880944	total: 3m 27s	remaining: 5m 48s
560:	learn: 0.3880234	total: 3m 28s	remaining: 5m 48s
561:	learn: 0.3879834	total: 3m 28s	remaining: 5m 47s
562:	learn: 0.3879083	total: 3m 28s	remaining: 5m 47s
563:	learn: 0.3878345	total: 3m 29s	remaining: 5m 47s
564:	learn: 0.3877652	total: 3m 29s	remaining: 5m 46s
565:	learn: 0.3877168	total: 3m 29s	remaining: 5m 46s
566:	learn: 0.3876686	total: 3m 30s	remaining: 5m 45s
567:	learn: 0.3876007	total: 3m 30s	remaining: 5m 45s
568:	learn: 0.3875565	total: 3m 30s	remaining: 5m 44s
569:	learn: 0.3875041	total: 3m 31s	remaining: 5m 44s
570:	learn: 0.3874421	total: 3m 31s	remaining: 5m 43s
571:	learn: 0.3873876	total: 3m 31s	remaining: 5m 43s
572:	learn: 0.3873172	total: 3m 32s	remaining: 5m 43s
573:	learn: 0.3872339	total: 3m 32s	remaining: 5m 43s
574:	learn: 0.3871725	total: 3m 32s	remaining: 5m 42s
575:	learn: 0.3871304	total: 3m 33s	remaining: 5m 42s
576:	learn: 0.3870730	total: 3m 33s	remaining: 5m 41s
577:	learn: 0.3870163	total: 3m 34s	remaining: 5m 41s
578:	learn: 0.3869635	total: 3m 34s	remaining: 5m 41s
579:	learn: 0.3869060	total: 3m 34s	remaining: 5m 40s
580:	learn: 0.3868450	total: 3m 35s	remaining: 5m 40s
581:	learn: 0.3867902	total: 3m 35s	remaining: 5m 40s
582:	learn: 0.3867303	total: 3m 35s	remaining: 5m 39s
583:	learn: 0.3866831	total: 3m 36s	remaining: 5m 39s
584:	learn: 0.3866262	total: 3m 36s	remaining: 5m 38s
585:	learn: 0.3865504	total: 3m 37s	remaining: 5m 38s
586:	learn: 0.3865033	total: 3m 37s	remaining: 5m 38s
587:	learn: 0.3864356	total: 3m 37s	remaining: 5m 37s
588:	learn: 0.3863714	total: 3m 38s	remaining: 5m 37s
589:	learn: 0.3863248	total: 3m 38s	remaining: 5m 36s
590:	learn: 0.3862809	total: 3m 38s	remaining: 5m 36s
591:	learn: 0.3862148	total: 3m 39s	remaining: 5m 36s
592:	learn: 0.3861521	total: 3m 39s	remaining: 5m 35s
593:	learn: 0.3861183	total: 3m 39s	remaining: 5m 35s
594:	learn: 0.3860480	total: 3m 40s	remaining: 5m 34s
595:	learn: 0.3859903	total: 3m 40s	remaining: 5m 34s
596:	learn: 0.3859337	total: 3m 40s	remaining: 5m 33s
597:	learn: 0.3858955	total: 3m 41s	remaining: 5m 33s
598:	learn: 0.3858092	total: 3m 41s	remaining: 5m 33s
599:	learn: 0.3857516	total: 3m 41s	remaining: 5m 32s
600:	learn: 0.3856916	total: 3m 42s	remaining: 5m 32s
601:	learn: 0.3856259	total: 3m 42s	remaining: 5m 31s
602:	learn: 0.3855717	total: 3m 42s	remaining: 5m 31s
603:	learn: 0.3854984	total: 3m 43s	remaining: 5m 31s
604:	learn: 0.3854413	total: 3m 43s	remaining: 5m 30s
605:	learn: 0.3853901	total: 3m 43s	remaining: 5m 30s
606:	learn: 0.3853313	total: 3m 44s	remaining: 5m 29s
607:	learn: 0.3852751	total: 3m 44s	remaining: 5m 29s
608:	learn: 0.3852358	total: 3m 44s	remaining: 5m 29s
609:	learn: 0.3851757	total: 3m 45s	remaining: 5m 28s
610:	learn: 0.3851177	total: 3m 45s	remaining: 5m 28s
611:	learn: 0.3850426	total: 3m 46s	remaining: 5m 27s
612:	learn: 0.3849708	total: 3m 46s	remaining: 5m 27s
613:	learn: 0.3849181	total: 3m 46s	remaining: 5m 27s
614:	learn: 0.3848649	total: 3m 47s	remaining: 5m 26s
615:	learn: 0.3848182	total: 3m 47s	remaining: 5m 26s
616:	learn: 0.3847511	total: 3m 47s	remaining: 5m 25s
617:	learn: 0.3847087	total: 3m 48s	remaining: 5m 25s
618:	learn: 0.3846429	total: 3m 48s	remaining: 5m 25s
619:	learn: 0.3845738	total: 3m 48s	remaining: 5m 24s
620:	learn: 0.3845201	total: 3m 49s	remaining: 5m 24s
621:	learn: 0.3845182	total: 3m 49s	remaining: 5m 23s
622:	learn: 0.3844620	total: 3m 49s	remaining: 5m 23s
623:	learn: 0.3844064	total: 3m 50s	remaining: 5m 22s
624:	learn: 0.3843517	total: 3m 50s	remaining: 5m 22s
625:	learn: 0.3843039	total: 3m 50s	remaining: 5m 22s
626:	learn: 0.3842601	total: 3m 51s	remaining: 5m 21s
627:	learn: 0.3842052	total: 3m 51s	remaining: 5m 21s
628:	learn: 0.3841591	total: 3m 51s	remaining: 5m 21s
629:	learn: 0.3841033	total: 3m 52s	remaining: 5m 20s
630:	learn: 0.3840563	total: 3m 52s	remaining: 5m 20s
631:	learn: 0.3840115	total: 3m 52s	remaining: 5m 19s
632:	learn: 0.3839413	total: 3m 53s	remaining: 5m 19s
633:	learn: 0.3838910	total: 3m 53s	remaining: 5m 19s
634:	learn: 0.3838109	total: 3m 54s	remaining: 5m 18s
635:	learn: 0.3837596	total: 3m 54s	remaining: 5m 18s
636:	learn: 0.3837019	total: 3m 54s	remaining: 5m 18s
637:	learn: 0.3836462	total: 3m 55s	remaining: 5m 17s
638:	learn: 0.3835936	total: 3m 55s	remaining: 5m 17s
639:	learn: 0.3835277	total: 3m 55s	remaining: 5m 16s
640:	learn: 0.3834840	total: 3m 56s	remaining: 5m 16s
641:	learn: 0.3834449	total: 3m 56s	remaining: 5m 16s
642:	learn: 0.3833699	total: 3m 56s	remaining: 5m 15s
643:	learn: 0.3833332	total: 3m 57s	remaining: 5m 15s
644:	learn: 0.3832962	total: 3m 57s	remaining: 5m 14s
645:	learn: 0.3832416	total: 3m 57s	remaining: 5m 14s
646:	learn: 0.3831791	total: 3m 58s	remaining: 5m 13s
647:	learn: 0.3831223	total: 3m 58s	remaining: 5m 13s
648:	learn: 0.3830824	total: 3m 58s	remaining: 5m 13s
649:	learn: 0.3830128	total: 3m 59s	remaining: 5m 12s
650:	learn: 0.3829569	total: 3m 59s	remaining: 5m 12s
651:	learn: 0.3829127	total: 3m 59s	remaining: 5m 11s
652:	learn: 0.3828664	total: 4m	remaining: 5m 11s
653:	learn: 0.3828015	total: 4m	remaining: 5m 10s
654:	learn: 0.3827469	total: 4m	remaining: 5m 10s
655:	learn: 0.3826972	total: 4m 1s	remaining: 5m 10s
656:	learn: 0.3826270	total: 4m 1s	remaining: 5m 9s
657:	learn: 0.3825596	total: 4m 1s	remaining: 5m 9s
658:	learn: 0.3825143	total: 4m 2s	remaining: 5m 9s
659:	learn: 0.3824616	total: 4m 2s	remaining: 5m 8s
660:	learn: 0.3824248	total: 4m 2s	remaining: 5m 8s
661:	learn: 0.3823685	total: 4m 3s	remaining: 5m 7s
662:	learn: 0.3823114	total: 4m 3s	remaining: 5m 7s
663:	learn: 0.3822452	total: 4m 3s	remaining: 5m 7s
664:	learn: 0.3821791	total: 4m 4s	remaining: 5m 6s
665:	learn: 0.3821132	total: 4m 4s	remaining: 5m 6s
666:	learn: 0.3820572	total: 4m 4s	remaining: 5m 5s
667:	learn: 0.3819959	total: 4m 5s	remaining: 5m 5s
668:	learn: 0.3819404	total: 4m 5s	remaining: 5m 5s
669:	learn: 0.3818812	total: 4m 6s	remaining: 5m 4s
670:	learn: 0.3818340	total: 4m 6s	remaining: 5m 4s
671:	learn: 0.3817923	total: 4m 6s	remaining: 5m 4s
672:	learn: 0.3817396	total: 4m 7s	remaining: 5m 3s
673:	learn: 0.3816710	total: 4m 7s	remaining: 5m 3s
674:	learn: 0.3816112	total: 4m 7s	remaining: 5m 3s
675:	learn: 0.3815513	total: 4m 8s	remaining: 5m 2s
676:	learn: 0.3815168	total: 4m 8s	remaining: 5m 2s
677:	learn: 0.3814783	total: 4m 8s	remaining: 5m 1s
678:	learn: 0.3814327	total: 4m 9s	remaining: 5m 1s
679:	learn: 0.3813881	total: 4m 9s	remaining: 5m
680:	learn: 0.3813370	total: 4m 9s	remaining: 5m
681:	learn: 0.3813006	total: 4m 10s	remaining: 5m
682:	learn: 0.3812483	total: 4m 10s	remaining: 4m 59s
683:	learn: 0.3811940	total: 4m 11s	remaining: 4m 59s
684:	learn: 0.3811290	total: 4m 11s	remaining: 4m 59s
685:	learn: 0.3810942	total: 4m 11s	remaining: 4m 58s
686:	learn: 0.3810310	total: 4m 12s	remaining: 4m 58s
687:	learn: 0.3809768	total: 4m 12s	remaining: 4m 57s
688:	learn: 0.3809174	total: 4m 12s	remaining: 4m 57s
689:	learn: 0.3808393	total: 4m 13s	remaining: 4m 57s
690:	learn: 0.3807768	total: 4m 13s	remaining: 4m 56s
691:	learn: 0.3807068	total: 4m 14s	remaining: 4m 56s
692:	learn: 0.3806596	total: 4m 14s	remaining: 4m 56s
693:	learn: 0.3806023	total: 4m 14s	remaining: 4m 55s
694:	learn: 0.3805668	total: 4m 14s	remaining: 4m 55s
695:	learn: 0.3805031	total: 4m 15s	remaining: 4m 54s
696:	learn: 0.3804444	total: 4m 15s	remaining: 4m 54s
697:	learn: 0.3803866	total: 4m 16s	remaining: 4m 54s
698:	learn: 0.3803500	total: 4m 16s	remaining: 4m 53s
699:	learn: 0.3802898	total: 4m 16s	remaining: 4m 53s
700:	learn: 0.3802302	total: 4m 17s	remaining: 4m 53s
701:	learn: 0.3801668	total: 4m 17s	remaining: 4m 52s
702:	learn: 0.3801066	total: 4m 17s	remaining: 4m 52s
703:	learn: 0.3800555	total: 4m 18s	remaining: 4m 51s
704:	learn: 0.3799872	total: 4m 18s	remaining: 4m 51s
705:	learn: 0.3799394	total: 4m 18s	remaining: 4m 51s
706:	learn: 0.3798838	total: 4m 19s	remaining: 4m 50s
707:	learn: 0.3798203	total: 4m 19s	remaining: 4m 50s
708:	learn: 0.3797781	total: 4m 19s	remaining: 4m 49s
709:	learn: 0.3797189	total: 4m 20s	remaining: 4m 49s
710:	learn: 0.3796587	total: 4m 20s	remaining: 4m 49s
711:	learn: 0.3796151	total: 4m 20s	remaining: 4m 48s
712:	learn: 0.3795628	total: 4m 21s	remaining: 4m 48s
713:	learn: 0.3794967	total: 4m 21s	remaining: 4m 47s
714:	learn: 0.3794422	total: 4m 22s	remaining: 4m 47s
715:	learn: 0.3793862	total: 4m 22s	remaining: 4m 47s
716:	learn: 0.3793219	total: 4m 22s	remaining: 4m 46s
717:	learn: 0.3792711	total: 4m 23s	remaining: 4m 46s
718:	learn: 0.3792203	total: 4m 23s	remaining: 4m 46s
719:	learn: 0.3791834	total: 4m 23s	remaining: 4m 45s
720:	learn: 0.3791419	total: 4m 24s	remaining: 4m 45s
721:	learn: 0.3791064	total: 4m 24s	remaining: 4m 45s
722:	learn: 0.3790448	total: 4m 24s	remaining: 4m 44s
723:	learn: 0.3790417	total: 4m 25s	remaining: 4m 44s
724:	learn: 0.3789965	total: 4m 25s	remaining: 4m 43s
725:	learn: 0.3789449	total: 4m 25s	remaining: 4m 43s
726:	learn: 0.3789033	total: 4m 26s	remaining: 4m 42s
727:	learn: 0.3788610	total: 4m 26s	remaining: 4m 42s
728:	learn: 0.3787996	total: 4m 26s	remaining: 4m 42s
729:	learn: 0.3787345	total: 4m 27s	remaining: 4m 41s
730:	learn: 0.3786660	total: 4m 27s	remaining: 4m 41s
731:	learn: 0.3785989	total: 4m 27s	remaining: 4m 41s
732:	learn: 0.3785353	total: 4m 28s	remaining: 4m 40s
733:	learn: 0.3784921	total: 4m 28s	remaining: 4m 40s
734:	learn: 0.3784476	total: 4m 28s	remaining: 4m 39s
735:	learn: 0.3783957	total: 4m 29s	remaining: 4m 39s
736:	learn: 0.3783495	total: 4m 29s	remaining: 4m 39s
737:	learn: 0.3782959	total: 4m 29s	remaining: 4m 38s
738:	learn: 0.3782263	total: 4m 30s	remaining: 4m 38s
739:	learn: 0.3781758	total: 4m 30s	remaining: 4m 37s
740:	learn: 0.3781230	total: 4m 30s	remaining: 4m 37s
741:	learn: 0.3780773	total: 4m 31s	remaining: 4m 37s
742:	learn: 0.3780121	total: 4m 31s	remaining: 4m 36s
743:	learn: 0.3780093	total: 4m 31s	remaining: 4m 36s
744:	learn: 0.3779426	total: 4m 32s	remaining: 4m 35s
745:	learn: 0.3779028	total: 4m 32s	remaining: 4m 35s
746:	learn: 0.3778505	total: 4m 32s	remaining: 4m 35s
747:	learn: 0.3778108	total: 4m 33s	remaining: 4m 34s
748:	learn: 0.3777566	total: 4m 33s	remaining: 4m 34s
749:	learn: 0.3776968	total: 4m 33s	remaining: 4m 33s
750:	learn: 0.3776319	total: 4m 34s	remaining: 4m 33s
751:	learn: 0.3775729	total: 4m 34s	remaining: 4m 33s
752:	learn: 0.3775279	total: 4m 35s	remaining: 4m 32s
753:	learn: 0.3774719	total: 4m 35s	remaining: 4m 32s
754:	learn: 0.3774133	total: 4m 35s	remaining: 4m 32s
755:	learn: 0.3773524	total: 4m 36s	remaining: 4m 31s
756:	learn: 0.3773017	total: 4m 36s	remaining: 4m 31s
757:	learn: 0.3772378	total: 4m 36s	remaining: 4m 30s
758:	learn: 0.3771833	total: 4m 37s	remaining: 4m 30s
759:	learn: 0.3771461	total: 4m 37s	remaining: 4m 30s
760:	learn: 0.3770963	total: 4m 37s	remaining: 4m 29s
761:	learn: 0.3770308	total: 4m 38s	remaining: 4m 29s
762:	learn: 0.3769758	total: 4m 38s	remaining: 4m 29s
763:	learn: 0.3769374	total: 4m 38s	remaining: 4m 28s
764:	learn: 0.3768813	total: 4m 39s	remaining: 4m 28s
765:	learn: 0.3768368	total: 4m 39s	remaining: 4m 28s
766:	learn: 0.3767804	total: 4m 40s	remaining: 4m 27s
767:	learn: 0.3767108	total: 4m 40s	remaining: 4m 27s
768:	learn: 0.3766586	total: 4m 40s	remaining: 4m 26s
769:	learn: 0.3766101	total: 4m 41s	remaining: 4m 26s
770:	learn: 0.3765540	total: 4m 41s	remaining: 4m 26s
771:	learn: 0.3764971	total: 4m 41s	remaining: 4m 25s
772:	learn: 0.3764473	total: 4m 42s	remaining: 4m 25s
773:	learn: 0.3763888	total: 4m 42s	remaining: 4m 24s
774:	learn: 0.3763464	total: 4m 42s	remaining: 4m 24s
775:	learn: 0.3763000	total: 4m 43s	remaining: 4m 24s
776:	learn: 0.3762403	total: 4m 43s	remaining: 4m 23s
777:	learn: 0.3761947	total: 4m 43s	remaining: 4m 23s
778:	learn: 0.3761534	total: 4m 44s	remaining: 4m 22s
779:	learn: 0.3760900	total: 4m 44s	remaining: 4m 22s
780:	learn: 0.3760461	total: 4m 44s	remaining: 4m 22s
781:	learn: 0.3759906	total: 4m 45s	remaining: 4m 21s
782:	learn: 0.3759430	total: 4m 45s	remaining: 4m 21s
783:	learn: 0.3758741	total: 4m 46s	remaining: 4m 21s
784:	learn: 0.3758292	total: 4m 46s	remaining: 4m 20s
785:	learn: 0.3757746	total: 4m 46s	remaining: 4m 20s
786:	learn: 0.3757299	total: 4m 47s	remaining: 4m 20s
787:	learn: 0.3756724	total: 4m 47s	remaining: 4m 20s
788:	learn: 0.3756199	total: 4m 48s	remaining: 4m 19s
789:	learn: 0.3755754	total: 4m 48s	remaining: 4m 19s
790:	learn: 0.3755271	total: 4m 48s	remaining: 4m 18s
791:	learn: 0.3754599	total: 4m 49s	remaining: 4m 18s
792:	learn: 0.3754021	total: 4m 49s	remaining: 4m 18s
793:	learn: 0.3753562	total: 4m 50s	remaining: 4m 18s
794:	learn: 0.3753057	total: 4m 50s	remaining: 4m 17s
795:	learn: 0.3752642	total: 4m 50s	remaining: 4m 17s
796:	learn: 0.3752163	total: 4m 51s	remaining: 4m 16s
797:	learn: 0.3751644	total: 4m 51s	remaining: 4m 16s
798:	learn: 0.3751183	total: 4m 51s	remaining: 4m 16s
799:	learn: 0.3750748	total: 4m 52s	remaining: 4m 15s
800:	learn: 0.3750133	total: 4m 52s	remaining: 4m 15s
801:	learn: 0.3749578	total: 4m 52s	remaining: 4m 14s
802:	learn: 0.3749093	total: 4m 53s	remaining: 4m 14s
803:	learn: 0.3748603	total: 4m 53s	remaining: 4m 14s
804:	learn: 0.3748332	total: 4m 53s	remaining: 4m 13s
805:	learn: 0.3747875	total: 4m 54s	remaining: 4m 13s
806:	learn: 0.3747217	total: 4m 54s	remaining: 4m 13s
807:	learn: 0.3746632	total: 4m 55s	remaining: 4m 12s
808:	learn: 0.3746252	total: 4m 55s	remaining: 4m 12s
809:	learn: 0.3745750	total: 4m 55s	remaining: 4m 12s
810:	learn: 0.3745206	total: 4m 56s	remaining: 4m 11s
811:	learn: 0.3744703	total: 4m 56s	remaining: 4m 11s
812:	learn: 0.3744320	total: 4m 57s	remaining: 4m 10s
813:	learn: 0.3743681	total: 4m 57s	remaining: 4m 10s
814:	learn: 0.3743068	total: 4m 57s	remaining: 4m 10s
815:	learn: 0.3742460	total: 4m 58s	remaining: 4m 9s
816:	learn: 0.3742089	total: 4m 58s	remaining: 4m 9s
817:	learn: 0.3741524	total: 4m 58s	remaining: 4m 9s
818:	learn: 0.3740909	total: 4m 59s	remaining: 4m 8s
819:	learn: 0.3740313	total: 4m 59s	remaining: 4m 8s
820:	learn: 0.3739665	total: 4m 59s	remaining: 4m 8s
821:	learn: 0.3739102	total: 5m	remaining: 4m 7s
822:	learn: 0.3738525	total: 5m	remaining: 4m 7s
823:	learn: 0.3738102	total: 5m	remaining: 4m 6s
824:	learn: 0.3737647	total: 5m 1s	remaining: 4m 6s
825:	learn: 0.3737153	total: 5m 1s	remaining: 4m 6s
826:	learn: 0.3736685	total: 5m 1s	remaining: 4m 5s
827:	learn: 0.3736277	total: 5m 2s	remaining: 4m 5s
828:	learn: 0.3735838	total: 5m 2s	remaining: 4m 4s
829:	learn: 0.3735271	total: 5m 2s	remaining: 4m 4s
830:	learn: 0.3734687	total: 5m 3s	remaining: 4m 4s
831:	learn: 0.3734281	total: 5m 3s	remaining: 4m 3s
832:	learn: 0.3733679	total: 5m 3s	remaining: 4m 3s
833:	learn: 0.3733177	total: 5m 4s	remaining: 4m 2s
834:	learn: 0.3732661	total: 5m 4s	remaining: 4m 2s
835:	learn: 0.3732331	total: 5m 4s	remaining: 4m 2s
836:	learn: 0.3731953	total: 5m 5s	remaining: 4m 1s
837:	learn: 0.3731340	total: 5m 5s	remaining: 4m 1s
838:	learn: 0.3730946	total: 5m 6s	remaining: 4m 1s
839:	learn: 0.3730480	total: 5m 6s	remaining: 4m
840:	learn: 0.3729987	total: 5m 6s	remaining: 4m
841:	learn: 0.3729394	total: 5m 7s	remaining: 3m 59s
842:	learn: 0.3728751	total: 5m 7s	remaining: 3m 59s
843:	learn: 0.3728254	total: 5m 7s	remaining: 3m 59s
844:	learn: 0.3727648	total: 5m 8s	remaining: 3m 58s
845:	learn: 0.3727630	total: 5m 8s	remaining: 3m 58s
846:	learn: 0.3727106	total: 5m 8s	remaining: 3m 58s
847:	learn: 0.3726684	total: 5m 9s	remaining: 3m 57s
848:	learn: 0.3726304	total: 5m 9s	remaining: 3m 57s
849:	learn: 0.3725779	total: 5m 9s	remaining: 3m 56s
850:	learn: 0.3725281	total: 5m 10s	remaining: 3m 56s
851:	learn: 0.3724803	total: 5m 10s	remaining: 3m 56s
852:	learn: 0.3724104	total: 5m 11s	remaining: 3m 56s
853:	learn: 0.3723587	total: 5m 11s	remaining: 3m 55s
854:	learn: 0.3723188	total: 5m 11s	remaining: 3m 55s
855:	learn: 0.3722659	total: 5m 12s	remaining: 3m 54s
856:	learn: 0.3722258	total: 5m 12s	remaining: 3m 54s
857:	learn: 0.3721726	total: 5m 13s	remaining: 3m 54s
858:	learn: 0.3721189	total: 5m 13s	remaining: 3m 54s
859:	learn: 0.3720742	total: 5m 13s	remaining: 3m 53s
860:	learn: 0.3720117	total: 5m 14s	remaining: 3m 53s
861:	learn: 0.3719562	total: 5m 14s	remaining: 3m 52s
862:	learn: 0.3718969	total: 5m 15s	remaining: 3m 52s
863:	learn: 0.3718464	total: 5m 15s	remaining: 3m 52s
864:	learn: 0.3718036	total: 5m 15s	remaining: 3m 51s
865:	learn: 0.3717538	total: 5m 16s	remaining: 3m 51s
866:	learn: 0.3717009	total: 5m 16s	remaining: 3m 51s
867:	learn: 0.3716466	total: 5m 17s	remaining: 3m 50s
868:	learn: 0.3715879	total: 5m 17s	remaining: 3m 50s
869:	learn: 0.3715429	total: 5m 17s	remaining: 3m 50s
870:	learn: 0.3714830	total: 5m 18s	remaining: 3m 49s
871:	learn: 0.3714374	total: 5m 18s	remaining: 3m 49s
872:	learn: 0.3713803	total: 5m 18s	remaining: 3m 49s
873:	learn: 0.3713383	total: 5m 19s	remaining: 3m 48s
874:	learn: 0.3713022	total: 5m 19s	remaining: 3m 48s
875:	learn: 0.3712550	total: 5m 19s	remaining: 3m 47s
876:	learn: 0.3712075	total: 5m 20s	remaining: 3m 47s
877:	learn: 0.3711533	total: 5m 20s	remaining: 3m 46s
878:	learn: 0.3710924	total: 5m 20s	remaining: 3m 46s
879:	learn: 0.3710415	total: 5m 21s	remaining: 3m 46s
880:	learn: 0.3709948	total: 5m 21s	remaining: 3m 45s
881:	learn: 0.3709316	total: 5m 21s	remaining: 3m 45s
882:	learn: 0.3708800	total: 5m 22s	remaining: 3m 45s
883:	learn: 0.3708203	total: 5m 22s	remaining: 3m 44s
884:	learn: 0.3707745	total: 5m 22s	remaining: 3m 44s
885:	learn: 0.3707323	total: 5m 23s	remaining: 3m 43s
886:	learn: 0.3706834	total: 5m 23s	remaining: 3m 43s
887:	learn: 0.3706374	total: 5m 23s	remaining: 3m 43s
888:	learn: 0.3705868	total: 5m 24s	remaining: 3m 42s
889:	learn: 0.3705454	total: 5m 24s	remaining: 3m 42s
890:	learn: 0.3704959	total: 5m 24s	remaining: 3m 42s
891:	learn: 0.3704503	total: 5m 25s	remaining: 3m 41s
892:	learn: 0.3703907	total: 5m 25s	remaining: 3m 41s
893:	learn: 0.3703491	total: 5m 26s	remaining: 3m 40s
894:	learn: 0.3702863	total: 5m 26s	remaining: 3m 40s
895:	learn: 0.3702288	total: 5m 26s	remaining: 3m 40s
896:	learn: 0.3701890	total: 5m 27s	remaining: 3m 39s
897:	learn: 0.3701400	total: 5m 27s	remaining: 3m 39s
898:	learn: 0.3700924	total: 5m 27s	remaining: 3m 39s
899:	learn: 0.3700443	total: 5m 28s	remaining: 3m 38s
900:	learn: 0.3700111	total: 5m 28s	remaining: 3m 38s
901:	learn: 0.3699815	total: 5m 28s	remaining: 3m 38s
902:	learn: 0.3699261	total: 5m 29s	remaining: 3m 37s
903:	learn: 0.3698714	total: 5m 29s	remaining: 3m 37s
904:	learn: 0.3698277	total: 5m 30s	remaining: 3m 37s
905:	learn: 0.3697620	total: 5m 30s	remaining: 3m 36s
906:	learn: 0.3696990	total: 5m 30s	remaining: 3m 36s
907:	learn: 0.3696590	total: 5m 31s	remaining: 3m 35s
908:	learn: 0.3695914	total: 5m 31s	remaining: 3m 35s
909:	learn: 0.3695452	total: 5m 32s	remaining: 3m 35s
910:	learn: 0.3694886	total: 5m 32s	remaining: 3m 34s
911:	learn: 0.3694336	total: 5m 32s	remaining: 3m 34s
912:	learn: 0.3693878	total: 5m 33s	remaining: 3m 34s
913:	learn: 0.3693366	total: 5m 33s	remaining: 3m 33s
914:	learn: 0.3692890	total: 5m 33s	remaining: 3m 33s
915:	learn: 0.3692470	total: 5m 34s	remaining: 3m 33s
916:	learn: 0.3692011	total: 5m 34s	remaining: 3m 32s
917:	learn: 0.3691516	total: 5m 34s	remaining: 3m 32s
918:	learn: 0.3690953	total: 5m 35s	remaining: 3m 31s
919:	learn: 0.3690537	total: 5m 35s	remaining: 3m 31s
920:	learn: 0.3689911	total: 5m 36s	remaining: 3m 31s
921:	learn: 0.3689333	total: 5m 36s	remaining: 3m 30s
922:	learn: 0.3688838	total: 5m 36s	remaining: 3m 30s
923:	learn: 0.3688307	total: 5m 37s	remaining: 3m 30s
924:	learn: 0.3687781	total: 5m 37s	remaining: 3m 29s
925:	learn: 0.3687256	total: 5m 37s	remaining: 3m 29s
926:	learn: 0.3686854	total: 5m 38s	remaining: 3m 29s
927:	learn: 0.3686365	total: 5m 38s	remaining: 3m 28s
928:	learn: 0.3685766	total: 5m 39s	remaining: 3m 28s
929:	learn: 0.3685128	total: 5m 39s	remaining: 3m 28s
930:	learn: 0.3684588	total: 5m 39s	remaining: 3m 27s
931:	learn: 0.3684119	total: 5m 40s	remaining: 3m 27s
932:	learn: 0.3683651	total: 5m 40s	remaining: 3m 26s
933:	learn: 0.3682940	total: 5m 40s	remaining: 3m 26s
934:	learn: 0.3682240	total: 5m 41s	remaining: 3m 26s
935:	learn: 0.3681605	total: 5m 41s	remaining: 3m 25s
936:	learn: 0.3681042	total: 5m 42s	remaining: 3m 25s
937:	learn: 0.3680487	total: 5m 42s	remaining: 3m 25s
938:	learn: 0.3679906	total: 5m 42s	remaining: 3m 24s
939:	learn: 0.3679348	total: 5m 43s	remaining: 3m 24s
940:	learn: 0.3679092	total: 5m 43s	remaining: 3m 24s
941:	learn: 0.3678529	total: 5m 44s	remaining: 3m 23s
942:	learn: 0.3678049	total: 5m 44s	remaining: 3m 23s
943:	learn: 0.3677631	total: 5m 44s	remaining: 3m 23s
944:	learn: 0.3677149	total: 5m 45s	remaining: 3m 22s
945:	learn: 0.3676506	total: 5m 45s	remaining: 3m 22s
946:	learn: 0.3675944	total: 5m 45s	remaining: 3m 22s
947:	learn: 0.3675453	total: 5m 46s	remaining: 3m 21s
948:	learn: 0.3674902	total: 5m 46s	remaining: 3m 21s
949:	learn: 0.3674340	total: 5m 47s	remaining: 3m 20s
950:	learn: 0.3673818	total: 5m 47s	remaining: 3m 20s
951:	learn: 0.3673418	total: 5m 47s	remaining: 3m 20s
952:	learn: 0.3672949	total: 5m 48s	remaining: 3m 19s
953:	learn: 0.3672515	total: 5m 48s	remaining: 3m 19s
954:	learn: 0.3671929	total: 5m 48s	remaining: 3m 19s
955:	learn: 0.3671373	total: 5m 49s	remaining: 3m 18s
956:	learn: 0.3670873	total: 5m 49s	remaining: 3m 18s
957:	learn: 0.3670427	total: 5m 50s	remaining: 3m 18s
958:	learn: 0.3669923	total: 5m 50s	remaining: 3m 17s
959:	learn: 0.3669389	total: 5m 50s	remaining: 3m 17s
960:	learn: 0.3668952	total: 5m 51s	remaining: 3m 16s
961:	learn: 0.3668352	total: 5m 51s	remaining: 3m 16s
962:	learn: 0.3667979	total: 5m 51s	remaining: 3m 16s
963:	learn: 0.3667619	total: 5m 52s	remaining: 3m 15s
964:	learn: 0.3667153	total: 5m 52s	remaining: 3m 15s
965:	learn: 0.3666552	total: 5m 53s	remaining: 3m 15s
966:	learn: 0.3666175	total: 5m 53s	remaining: 3m 14s
967:	learn: 0.3665791	total: 5m 53s	remaining: 3m 14s
968:	learn: 0.3665220	total: 5m 54s	remaining: 3m 14s
969:	learn: 0.3664593	total: 5m 54s	remaining: 3m 13s
970:	learn: 0.3664069	total: 5m 54s	remaining: 3m 13s
971:	learn: 0.3663636	total: 5m 55s	remaining: 3m 12s
972:	learn: 0.3663114	total: 5m 55s	remaining: 3m 12s
973:	learn: 0.3662554	total: 5m 55s	remaining: 3m 12s
974:	learn: 0.3662178	total: 5m 56s	remaining: 3m 11s
975:	learn: 0.3661783	total: 5m 56s	remaining: 3m 11s
976:	learn: 0.3661255	total: 5m 56s	remaining: 3m 11s
977:	learn: 0.3660643	total: 5m 57s	remaining: 3m 10s
978:	learn: 0.3660155	total: 5m 57s	remaining: 3m 10s
979:	learn: 0.3659663	total: 5m 58s	remaining: 3m 10s
980:	learn: 0.3659109	total: 5m 58s	remaining: 3m 9s
981:	learn: 0.3658700	total: 5m 58s	remaining: 3m 9s
982:	learn: 0.3658533	total: 5m 59s	remaining: 3m 8s
983:	learn: 0.3658010	total: 5m 59s	remaining: 3m 8s
984:	learn: 0.3657526	total: 6m	remaining: 3m 8s
985:	learn: 0.3657005	total: 6m	remaining: 3m 7s
986:	learn: 0.3656546	total: 6m	remaining: 3m 7s
987:	learn: 0.3656185	total: 6m 1s	remaining: 3m 7s
988:	learn: 0.3655530	total: 6m 1s	remaining: 3m 6s
989:	learn: 0.3655070	total: 6m 1s	remaining: 3m 6s
990:	learn: 0.3654515	total: 6m 2s	remaining: 3m 5s
991:	learn: 0.3654067	total: 6m 2s	remaining: 3m 5s
992:	learn: 0.3653630	total: 6m 2s	remaining: 3m 5s
993:	learn: 0.3653263	total: 6m 3s	remaining: 3m 4s
994:	learn: 0.3652741	total: 6m 3s	remaining: 3m 4s
995:	learn: 0.3652404	total: 6m 3s	remaining: 3m 4s
996:	learn: 0.3651939	total: 6m 4s	remaining: 3m 3s
997:	learn: 0.3651495	total: 6m 4s	remaining: 3m 3s
998:	learn: 0.3650875	total: 6m 4s	remaining: 3m 3s
999:	learn: 0.3650420	total: 6m 5s	remaining: 3m 2s
1000:	learn: 0.3649990	total: 6m 5s	remaining: 3m 2s
1001:	learn: 0.3649372	total: 6m 5s	remaining: 3m 1s
1002:	learn: 0.3648972	total: 6m 6s	remaining: 3m 1s
1003:	learn: 0.3648488	total: 6m 6s	remaining: 3m 1s
1004:	learn: 0.3647951	total: 6m 7s	remaining: 3m
1005:	learn: 0.3647473	total: 6m 7s	remaining: 3m
1006:	learn: 0.3646994	total: 6m 7s	remaining: 3m
1007:	learn: 0.3646463	total: 6m 8s	remaining: 2m 59s
1008:	learn: 0.3646013	total: 6m 8s	remaining: 2m 59s
1009:	learn: 0.3645521	total: 6m 8s	remaining: 2m 58s
1010:	learn: 0.3644950	total: 6m 9s	remaining: 2m 58s
1011:	learn: 0.3644346	total: 6m 9s	remaining: 2m 58s
1012:	learn: 0.3643739	total: 6m 10s	remaining: 2m 57s
1013:	learn: 0.3643263	total: 6m 10s	remaining: 2m 57s
1014:	learn: 0.3642700	total: 6m 10s	remaining: 2m 57s
1015:	learn: 0.3642263	total: 6m 11s	remaining: 2m 56s
1016:	learn: 0.3641934	total: 6m 11s	remaining: 2m 56s
1017:	learn: 0.3641496	total: 6m 11s	remaining: 2m 56s
1018:	learn: 0.3641487	total: 6m 12s	remaining: 2m 55s
1019:	learn: 0.3641104	total: 6m 12s	remaining: 2m 55s
1020:	learn: 0.3640724	total: 6m 12s	remaining: 2m 54s
1021:	learn: 0.3640248	total: 6m 13s	remaining: 2m 54s
1022:	learn: 0.3639743	total: 6m 13s	remaining: 2m 54s
1023:	learn: 0.3639341	total: 6m 13s	remaining: 2m 53s
1024:	learn: 0.3638962	total: 6m 14s	remaining: 2m 53s
1025:	learn: 0.3638446	total: 6m 14s	remaining: 2m 53s
1026:	learn: 0.3638136	total: 6m 14s	remaining: 2m 52s
1027:	learn: 0.3637652	total: 6m 15s	remaining: 2m 52s
1028:	learn: 0.3637230	total: 6m 15s	remaining: 2m 52s
1029:	learn: 0.3636705	total: 6m 16s	remaining: 2m 51s
1030:	learn: 0.3636358	total: 6m 16s	remaining: 2m 51s
1031:	learn: 0.3635858	total: 6m 16s	remaining: 2m 50s
1032:	learn: 0.3635447	total: 6m 17s	remaining: 2m 50s
1033:	learn: 0.3635016	total: 6m 17s	remaining: 2m 50s
1034:	learn: 0.3634588	total: 6m 18s	remaining: 2m 49s
1035:	learn: 0.3634068	total: 6m 18s	remaining: 2m 49s
1036:	learn: 0.3633765	total: 6m 18s	remaining: 2m 49s
1037:	learn: 0.3633142	total: 6m 18s	remaining: 2m 48s
1038:	learn: 0.3632733	total: 6m 19s	remaining: 2m 48s
1039:	learn: 0.3632438	total: 6m 19s	remaining: 2m 47s
1040:	learn: 0.3632017	total: 6m 19s	remaining: 2m 47s
1041:	learn: 0.3631592	total: 6m 20s	remaining: 2m 47s
1042:	learn: 0.3631164	total: 6m 20s	remaining: 2m 46s
1043:	learn: 0.3630730	total: 6m 20s	remaining: 2m 46s
1044:	learn: 0.3630251	total: 6m 21s	remaining: 2m 45s
1045:	learn: 0.3629836	total: 6m 21s	remaining: 2m 45s
1046:	learn: 0.3629354	total: 6m 21s	remaining: 2m 45s
1047:	learn: 0.3629055	total: 6m 22s	remaining: 2m 44s
1048:	learn: 0.3628600	total: 6m 22s	remaining: 2m 44s
1049:	learn: 0.3628121	total: 6m 22s	remaining: 2m 44s
1050:	learn: 0.3627581	total: 6m 23s	remaining: 2m 43s
1051:	learn: 0.3627031	total: 6m 23s	remaining: 2m 43s
1052:	learn: 0.3626445	total: 6m 23s	remaining: 2m 42s
1053:	learn: 0.3625884	total: 6m 24s	remaining: 2m 42s
1054:	learn: 0.3625354	total: 6m 24s	remaining: 2m 42s
1055:	learn: 0.3624759	total: 6m 24s	remaining: 2m 41s
1056:	learn: 0.3624216	total: 6m 25s	remaining: 2m 41s
1057:	learn: 0.3623592	total: 6m 25s	remaining: 2m 41s
1058:	learn: 0.3623195	total: 6m 25s	remaining: 2m 40s
1059:	learn: 0.3622762	total: 6m 26s	remaining: 2m 40s
1060:	learn: 0.3622295	total: 6m 26s	remaining: 2m 39s
1061:	learn: 0.3621858	total: 6m 26s	remaining: 2m 39s
1062:	learn: 0.3621363	total: 6m 27s	remaining: 2m 39s
1063:	learn: 0.3620790	total: 6m 27s	remaining: 2m 38s
1064:	learn: 0.3620219	total: 6m 28s	remaining: 2m 38s
1065:	learn: 0.3619752	total: 6m 28s	remaining: 2m 38s
1066:	learn: 0.3619285	total: 6m 28s	remaining: 2m 37s
1067:	learn: 0.3618956	total: 6m 29s	remaining: 2m 37s
1068:	learn: 0.3618560	total: 6m 29s	remaining: 2m 36s
1069:	learn: 0.3618090	total: 6m 29s	remaining: 2m 36s
1070:	learn: 0.3617701	total: 6m 30s	remaining: 2m 36s
1071:	learn: 0.3617367	total: 6m 30s	remaining: 2m 35s
1072:	learn: 0.3616912	total: 6m 30s	remaining: 2m 35s
1073:	learn: 0.3616900	total: 6m 31s	remaining: 2m 35s
1074:	learn: 0.3616350	total: 6m 31s	remaining: 2m 34s
1075:	learn: 0.3615742	total: 6m 31s	remaining: 2m 34s
1076:	learn: 0.3615247	total: 6m 32s	remaining: 2m 34s
1077:	learn: 0.3614813	total: 6m 32s	remaining: 2m 33s
1078:	learn: 0.3614332	total: 6m 32s	remaining: 2m 33s
1079:	learn: 0.3613766	total: 6m 33s	remaining: 2m 32s
1080:	learn: 0.3613438	total: 6m 33s	remaining: 2m 32s
1081:	learn: 0.3613033	total: 6m 33s	remaining: 2m 32s
1082:	learn: 0.3612531	total: 6m 34s	remaining: 2m 31s
1083:	learn: 0.3612012	total: 6m 34s	remaining: 2m 31s
1084:	learn: 0.3611513	total: 6m 35s	remaining: 2m 31s
1085:	learn: 0.3611125	total: 6m 35s	remaining: 2m 30s
1086:	learn: 0.3610604	total: 6m 35s	remaining: 2m 30s
1087:	learn: 0.3610281	total: 6m 35s	remaining: 2m 29s
1088:	learn: 0.3609723	total: 6m 36s	remaining: 2m 29s
1089:	learn: 0.3609331	total: 6m 36s	remaining: 2m 29s
1090:	learn: 0.3608921	total: 6m 37s	remaining: 2m 28s
1091:	learn: 0.3608408	total: 6m 37s	remaining: 2m 28s
1092:	learn: 0.3608086	total: 6m 37s	remaining: 2m 28s
1093:	learn: 0.3607514	total: 6m 38s	remaining: 2m 27s
1094:	learn: 0.3607157	total: 6m 38s	remaining: 2m 27s
1095:	learn: 0.3606732	total: 6m 38s	remaining: 2m 26s
1096:	learn: 0.3606298	total: 6m 38s	remaining: 2m 26s
1097:	learn: 0.3605898	total: 6m 39s	remaining: 2m 26s
1098:	learn: 0.3605532	total: 6m 39s	remaining: 2m 25s
1099:	learn: 0.3605091	total: 6m 39s	remaining: 2m 25s
1100:	learn: 0.3604690	total: 6m 40s	remaining: 2m 25s
1101:	learn: 0.3604309	total: 6m 40s	remaining: 2m 24s
1102:	learn: 0.3603907	total: 6m 40s	remaining: 2m 24s
1103:	learn: 0.3603567	total: 6m 41s	remaining: 2m 23s
1104:	learn: 0.3603060	total: 6m 41s	remaining: 2m 23s
1105:	learn: 0.3602557	total: 6m 41s	remaining: 2m 23s
1106:	learn: 0.3602202	total: 6m 42s	remaining: 2m 22s
1107:	learn: 0.3601739	total: 6m 42s	remaining: 2m 22s
1108:	learn: 0.3601038	total: 6m 42s	remaining: 2m 22s
1109:	learn: 0.3600588	total: 6m 43s	remaining: 2m 21s
1110:	learn: 0.3600084	total: 6m 43s	remaining: 2m 21s
1111:	learn: 0.3599668	total: 6m 43s	remaining: 2m 20s
1112:	learn: 0.3599178	total: 6m 44s	remaining: 2m 20s
1113:	learn: 0.3598719	total: 6m 44s	remaining: 2m 20s
1114:	learn: 0.3598281	total: 6m 44s	remaining: 2m 19s
1115:	learn: 0.3597902	total: 6m 45s	remaining: 2m 19s
1116:	learn: 0.3597367	total: 6m 45s	remaining: 2m 19s
1117:	learn: 0.3596809	total: 6m 46s	remaining: 2m 18s
1118:	learn: 0.3596225	total: 6m 46s	remaining: 2m 18s
1119:	learn: 0.3595822	total: 6m 46s	remaining: 2m 18s
1120:	learn: 0.3595317	total: 6m 47s	remaining: 2m 17s
1121:	learn: 0.3594824	total: 6m 47s	remaining: 2m 17s
1122:	learn: 0.3594330	total: 6m 48s	remaining: 2m 16s
1123:	learn: 0.3593841	total: 6m 48s	remaining: 2m 16s
1124:	learn: 0.3593400	total: 6m 48s	remaining: 2m 16s
1125:	learn: 0.3592896	total: 6m 49s	remaining: 2m 15s
1126:	learn: 0.3592328	total: 6m 49s	remaining: 2m 15s
1127:	learn: 0.3591937	total: 6m 49s	remaining: 2m 15s
1128:	learn: 0.3591392	total: 6m 50s	remaining: 2m 14s
1129:	learn: 0.3590847	total: 6m 50s	remaining: 2m 14s
1130:	learn: 0.3590296	total: 6m 50s	remaining: 2m 14s
1131:	learn: 0.3589862	total: 6m 51s	remaining: 2m 13s
1132:	learn: 0.3589423	total: 6m 51s	remaining: 2m 13s
1133:	learn: 0.3589009	total: 6m 51s	remaining: 2m 12s
1134:	learn: 0.3588617	total: 6m 52s	remaining: 2m 12s
1135:	learn: 0.3587986	total: 6m 52s	remaining: 2m 12s
1136:	learn: 0.3587423	total: 6m 52s	remaining: 2m 11s
1137:	learn: 0.3586921	total: 6m 53s	remaining: 2m 11s
1138:	learn: 0.3586409	total: 6m 53s	remaining: 2m 11s
1139:	learn: 0.3585816	total: 6m 54s	remaining: 2m 10s
1140:	learn: 0.3585393	total: 6m 54s	remaining: 2m 10s
1141:	learn: 0.3585030	total: 6m 54s	remaining: 2m 10s
1142:	learn: 0.3584644	total: 6m 55s	remaining: 2m 9s
1143:	learn: 0.3584145	total: 6m 55s	remaining: 2m 9s
1144:	learn: 0.3583688	total: 6m 55s	remaining: 2m 8s
1145:	learn: 0.3583218	total: 6m 56s	remaining: 2m 8s
1146:	learn: 0.3582856	total: 6m 56s	remaining: 2m 8s
1147:	learn: 0.3582272	total: 6m 57s	remaining: 2m 7s
1148:	learn: 0.3581768	total: 6m 57s	remaining: 2m 7s
1149:	learn: 0.3581187	total: 6m 57s	remaining: 2m 7s
1150:	learn: 0.3580720	total: 6m 58s	remaining: 2m 6s
1151:	learn: 0.3580364	total: 6m 58s	remaining: 2m 6s
1152:	learn: 0.3579986	total: 6m 59s	remaining: 2m 6s
1153:	learn: 0.3579676	total: 6m 59s	remaining: 2m 5s
1154:	learn: 0.3579152	total: 6m 59s	remaining: 2m 5s
1155:	learn: 0.3578672	total: 7m	remaining: 2m 4s
1156:	learn: 0.3578172	total: 7m	remaining: 2m 4s
1157:	learn: 0.3577658	total: 7m	remaining: 2m 4s
1158:	learn: 0.3577205	total: 7m 1s	remaining: 2m 3s
1159:	learn: 0.3576771	total: 7m 1s	remaining: 2m 3s
1160:	learn: 0.3576314	total: 7m 1s	remaining: 2m 3s
1161:	learn: 0.3575893	total: 7m 2s	remaining: 2m 2s
1162:	learn: 0.3575373	total: 7m 2s	remaining: 2m 2s
1163:	learn: 0.3574963	total: 7m 2s	remaining: 2m 2s
1164:	learn: 0.3574506	total: 7m 3s	remaining: 2m 1s
1165:	learn: 0.3574094	total: 7m 3s	remaining: 2m 1s
1166:	learn: 0.3573646	total: 7m 4s	remaining: 2m 1s
1167:	learn: 0.3573062	total: 7m 4s	remaining: 2m
1168:	learn: 0.3572498	total: 7m 4s	remaining: 2m
1169:	learn: 0.3572145	total: 7m 5s	remaining: 1m 59s
1170:	learn: 0.3571522	total: 7m 5s	remaining: 1m 59s
1171:	learn: 0.3571075	total: 7m 5s	remaining: 1m 59s
1172:	learn: 0.3570566	total: 7m 6s	remaining: 1m 58s
1173:	learn: 0.3570099	total: 7m 6s	remaining: 1m 58s
1174:	learn: 0.3569669	total: 7m 7s	remaining: 1m 58s
1175:	learn: 0.3569471	total: 7m 7s	remaining: 1m 57s
1176:	learn: 0.3568886	total: 7m 7s	remaining: 1m 57s
1177:	learn: 0.3568462	total: 7m 8s	remaining: 1m 57s
1178:	learn: 0.3567962	total: 7m 8s	remaining: 1m 56s
1179:	learn: 0.3567383	total: 7m 8s	remaining: 1m 56s
1180:	learn: 0.3567053	total: 7m 9s	remaining: 1m 55s
1181:	learn: 0.3566658	total: 7m 9s	remaining: 1m 55s
1182:	learn: 0.3566261	total: 7m 9s	remaining: 1m 55s
1183:	learn: 0.3565772	total: 7m 10s	remaining: 1m 54s
1184:	learn: 0.3565269	total: 7m 10s	remaining: 1m 54s
1185:	learn: 0.3564822	total: 7m 10s	remaining: 1m 54s
1186:	learn: 0.3564221	total: 7m 11s	remaining: 1m 53s
1187:	learn: 0.3563865	total: 7m 11s	remaining: 1m 53s
1188:	learn: 0.3563506	total: 7m 12s	remaining: 1m 53s
1189:	learn: 0.3562974	total: 7m 12s	remaining: 1m 52s
1190:	learn: 0.3562611	total: 7m 13s	remaining: 1m 52s
1191:	learn: 0.3562082	total: 7m 13s	remaining: 1m 52s
1192:	learn: 0.3561730	total: 7m 13s	remaining: 1m 51s
1193:	learn: 0.3561334	total: 7m 14s	remaining: 1m 51s
1194:	learn: 0.3560838	total: 7m 14s	remaining: 1m 50s
1195:	learn: 0.3560399	total: 7m 14s	remaining: 1m 50s
1196:	learn: 0.3560054	total: 7m 15s	remaining: 1m 50s
1197:	learn: 0.3559763	total: 7m 15s	remaining: 1m 49s
1198:	learn: 0.3559289	total: 7m 15s	remaining: 1m 49s
1199:	learn: 0.3558693	total: 7m 16s	remaining: 1m 49s
1200:	learn: 0.3558084	total: 7m 16s	remaining: 1m 48s
1201:	learn: 0.3557714	total: 7m 17s	remaining: 1m 48s
1202:	learn: 0.3557138	total: 7m 17s	remaining: 1m 48s
1203:	learn: 0.3556639	total: 7m 17s	remaining: 1m 47s
1204:	learn: 0.3556291	total: 7m 18s	remaining: 1m 47s
1205:	learn: 0.3555727	total: 7m 18s	remaining: 1m 46s
1206:	learn: 0.3555298	total: 7m 19s	remaining: 1m 46s
1207:	learn: 0.3554912	total: 7m 19s	remaining: 1m 46s
1208:	learn: 0.3554370	total: 7m 19s	remaining: 1m 45s
1209:	learn: 0.3553847	total: 7m 20s	remaining: 1m 45s
1210:	learn: 0.3553388	total: 7m 20s	remaining: 1m 45s
1211:	learn: 0.3552875	total: 7m 20s	remaining: 1m 44s
1212:	learn: 0.3552532	total: 7m 21s	remaining: 1m 44s
1213:	learn: 0.3552186	total: 7m 21s	remaining: 1m 44s
1214:	learn: 0.3551783	total: 7m 21s	remaining: 1m 43s
1215:	learn: 0.3551475	total: 7m 22s	remaining: 1m 43s
1216:	learn: 0.3551041	total: 7m 22s	remaining: 1m 42s
1217:	learn: 0.3550569	total: 7m 22s	remaining: 1m 42s
1218:	learn: 0.3550116	total: 7m 23s	remaining: 1m 42s
1219:	learn: 0.3549762	total: 7m 23s	remaining: 1m 41s
1220:	learn: 0.3549253	total: 7m 24s	remaining: 1m 41s
1221:	learn: 0.3548667	total: 7m 24s	remaining: 1m 41s
1222:	learn: 0.3548266	total: 7m 24s	remaining: 1m 40s
1223:	learn: 0.3547883	total: 7m 25s	remaining: 1m 40s
1224:	learn: 0.3547473	total: 7m 25s	remaining: 1m 39s
1225:	learn: 0.3547098	total: 7m 25s	remaining: 1m 39s
1226:	learn: 0.3546636	total: 7m 26s	remaining: 1m 39s
1227:	learn: 0.3546150	total: 7m 26s	remaining: 1m 38s
1228:	learn: 0.3545614	total: 7m 26s	remaining: 1m 38s
1229:	learn: 0.3545083	total: 7m 27s	remaining: 1m 38s
1230:	learn: 0.3544438	total: 7m 27s	remaining: 1m 37s
1231:	learn: 0.3543934	total: 7m 27s	remaining: 1m 37s
1232:	learn: 0.3543538	total: 7m 28s	remaining: 1m 37s
1233:	learn: 0.3543003	total: 7m 28s	remaining: 1m 36s
1234:	learn: 0.3542527	total: 7m 28s	remaining: 1m 36s
1235:	learn: 0.3542064	total: 7m 29s	remaining: 1m 35s
1236:	learn: 0.3541536	total: 7m 29s	remaining: 1m 35s
1237:	learn: 0.3541176	total: 7m 29s	remaining: 1m 35s
1238:	learn: 0.3540786	total: 7m 30s	remaining: 1m 34s
1239:	learn: 0.3540437	total: 7m 30s	remaining: 1m 34s
1240:	learn: 0.3540013	total: 7m 30s	remaining: 1m 34s
1241:	learn: 0.3539543	total: 7m 31s	remaining: 1m 33s
1242:	learn: 0.3539078	total: 7m 31s	remaining: 1m 33s
1243:	learn: 0.3538627	total: 7m 32s	remaining: 1m 33s
1244:	learn: 0.3538106	total: 7m 32s	remaining: 1m 32s
1245:	learn: 0.3537607	total: 7m 33s	remaining: 1m 32s
1246:	learn: 0.3537056	total: 7m 33s	remaining: 1m 32s
1247:	learn: 0.3536510	total: 7m 34s	remaining: 1m 31s
1248:	learn: 0.3536036	total: 7m 34s	remaining: 1m 31s
1249:	learn: 0.3535559	total: 7m 34s	remaining: 1m 30s
1250:	learn: 0.3535161	total: 7m 35s	remaining: 1m 30s
1251:	learn: 0.3534765	total: 7m 35s	remaining: 1m 30s
1252:	learn: 0.3534336	total: 7m 35s	remaining: 1m 29s
1253:	learn: 0.3533972	total: 7m 36s	remaining: 1m 29s
1254:	learn: 0.3533402	total: 7m 36s	remaining: 1m 29s
1255:	learn: 0.3532861	total: 7m 37s	remaining: 1m 28s
1256:	learn: 0.3532606	total: 7m 37s	remaining: 1m 28s
1257:	learn: 0.3531995	total: 7m 37s	remaining: 1m 28s
1258:	learn: 0.3531720	total: 7m 38s	remaining: 1m 27s
1259:	learn: 0.3531175	total: 7m 38s	remaining: 1m 27s
1260:	learn: 0.3530654	total: 7m 38s	remaining: 1m 26s
1261:	learn: 0.3530106	total: 7m 39s	remaining: 1m 26s
1262:	learn: 0.3529725	total: 7m 39s	remaining: 1m 26s
1263:	learn: 0.3529363	total: 7m 39s	remaining: 1m 25s
1264:	learn: 0.3528916	total: 7m 40s	remaining: 1m 25s
1265:	learn: 0.3528430	total: 7m 40s	remaining: 1m 25s
1266:	learn: 0.3528087	total: 7m 40s	remaining: 1m 24s
1267:	learn: 0.3527761	total: 7m 41s	remaining: 1m 24s
1268:	learn: 0.3527298	total: 7m 41s	remaining: 1m 23s
1269:	learn: 0.3526828	total: 7m 41s	remaining: 1m 23s
1270:	learn: 0.3526458	total: 7m 42s	remaining: 1m 23s
1271:	learn: 0.3525985	total: 7m 42s	remaining: 1m 22s
1272:	learn: 0.3525616	total: 7m 42s	remaining: 1m 22s
1273:	learn: 0.3525174	total: 7m 43s	remaining: 1m 22s
1274:	learn: 0.3524746	total: 7m 43s	remaining: 1m 21s
1275:	learn: 0.3524202	total: 7m 43s	remaining: 1m 21s
1276:	learn: 0.3523747	total: 7m 44s	remaining: 1m 21s
1277:	learn: 0.3523250	total: 7m 44s	remaining: 1m 20s
1278:	learn: 0.3522742	total: 7m 44s	remaining: 1m 20s
1279:	learn: 0.3522178	total: 7m 45s	remaining: 1m 19s
1280:	learn: 0.3521881	total: 7m 45s	remaining: 1m 19s
1281:	learn: 0.3521445	total: 7m 45s	remaining: 1m 19s
1282:	learn: 0.3520961	total: 7m 46s	remaining: 1m 18s
1283:	learn: 0.3520344	total: 7m 46s	remaining: 1m 18s
1284:	learn: 0.3519574	total: 7m 47s	remaining: 1m 18s
1285:	learn: 0.3519159	total: 7m 47s	remaining: 1m 17s
1286:	learn: 0.3518737	total: 7m 47s	remaining: 1m 17s
1287:	learn: 0.3518288	total: 7m 48s	remaining: 1m 17s
1288:	learn: 0.3517851	total: 7m 48s	remaining: 1m 16s
1289:	learn: 0.3517328	total: 7m 48s	remaining: 1m 16s
1290:	learn: 0.3516833	total: 7m 49s	remaining: 1m 15s
1291:	learn: 0.3516494	total: 7m 49s	remaining: 1m 15s
1292:	learn: 0.3515897	total: 7m 50s	remaining: 1m 15s
1293:	learn: 0.3515439	total: 7m 50s	remaining: 1m 14s
1294:	learn: 0.3515047	total: 7m 50s	remaining: 1m 14s
1295:	learn: 0.3514567	total: 7m 51s	remaining: 1m 14s
1296:	learn: 0.3514058	total: 7m 51s	remaining: 1m 13s
1297:	learn: 0.3513531	total: 7m 51s	remaining: 1m 13s
1298:	learn: 0.3513278	total: 7m 52s	remaining: 1m 13s
1299:	learn: 0.3512850	total: 7m 52s	remaining: 1m 12s
1300:	learn: 0.3512286	total: 7m 52s	remaining: 1m 12s
1301:	learn: 0.3511960	total: 7m 53s	remaining: 1m 11s
1302:	learn: 0.3511399	total: 7m 53s	remaining: 1m 11s
1303:	learn: 0.3511050	total: 7m 53s	remaining: 1m 11s
1304:	learn: 0.3510575	total: 7m 54s	remaining: 1m 10s
1305:	learn: 0.3510180	total: 7m 54s	remaining: 1m 10s
1306:	learn: 0.3509683	total: 7m 55s	remaining: 1m 10s
1307:	learn: 0.3509461	total: 7m 55s	remaining: 1m 9s
1308:	learn: 0.3509108	total: 7m 55s	remaining: 1m 9s
1309:	learn: 0.3508815	total: 7m 55s	remaining: 1m 9s
1310:	learn: 0.3508347	total: 7m 56s	remaining: 1m 8s
1311:	learn: 0.3507832	total: 7m 56s	remaining: 1m 8s
1312:	learn: 0.3507362	total: 7m 57s	remaining: 1m 7s
1313:	learn: 0.3506951	total: 7m 57s	remaining: 1m 7s
1314:	learn: 0.3506470	total: 7m 57s	remaining: 1m 7s
1315:	learn: 0.3506164	total: 7m 58s	remaining: 1m 6s
1316:	learn: 0.3505825	total: 7m 58s	remaining: 1m 6s
1317:	learn: 0.3505280	total: 7m 58s	remaining: 1m 6s
1318:	learn: 0.3504800	total: 7m 59s	remaining: 1m 5s
1319:	learn: 0.3504411	total: 7m 59s	remaining: 1m 5s
1320:	learn: 0.3503951	total: 7m 59s	remaining: 1m 5s
1321:	learn: 0.3503532	total: 8m	remaining: 1m 4s
1322:	learn: 0.3503132	total: 8m	remaining: 1m 4s
1323:	learn: 0.3502720	total: 8m	remaining: 1m 3s
1324:	learn: 0.3502199	total: 8m 1s	remaining: 1m 3s
1325:	learn: 0.3501803	total: 8m 1s	remaining: 1m 3s
1326:	learn: 0.3501548	total: 8m 1s	remaining: 1m 2s
1327:	learn: 0.3501046	total: 8m 2s	remaining: 1m 2s
1328:	learn: 0.3500588	total: 8m 2s	remaining: 1m 2s
1329:	learn: 0.3500029	total: 8m 2s	remaining: 1m 1s
1330:	learn: 0.3499738	total: 8m 3s	remaining: 1m 1s
1331:	learn: 0.3499352	total: 8m 3s	remaining: 1m
1332:	learn: 0.3498938	total: 8m 3s	remaining: 1m
1333:	learn: 0.3498491	total: 8m 4s	remaining: 1m
1334:	learn: 0.3498092	total: 8m 4s	remaining: 59.9s
1335:	learn: 0.3497674	total: 8m 4s	remaining: 59.5s
1336:	learn: 0.3497380	total: 8m 5s	remaining: 59.1s
1337:	learn: 0.3496815	total: 8m 5s	remaining: 58.8s
1338:	learn: 0.3496353	total: 8m 5s	remaining: 58.4s
1339:	learn: 0.3495903	total: 8m 6s	remaining: 58.1s
1340:	learn: 0.3495454	total: 8m 6s	remaining: 57.7s
1341:	learn: 0.3495024	total: 8m 7s	remaining: 57.4s
1342:	learn: 0.3494667	total: 8m 7s	remaining: 57s
1343:	learn: 0.3494181	total: 8m 7s	remaining: 56.6s
1344:	learn: 0.3493687	total: 8m 8s	remaining: 56.3s
1345:	learn: 0.3493206	total: 8m 8s	remaining: 55.9s
1346:	learn: 0.3492866	total: 8m 8s	remaining: 55.5s
1347:	learn: 0.3492506	total: 8m 9s	remaining: 55.2s
1348:	learn: 0.3492035	total: 8m 9s	remaining: 54.8s
1349:	learn: 0.3491525	total: 8m 9s	remaining: 54.4s
1350:	learn: 0.3491129	total: 8m 10s	remaining: 54.1s
1351:	learn: 0.3490616	total: 8m 10s	remaining: 53.7s
1352:	learn: 0.3490173	total: 8m 10s	remaining: 53.3s
1353:	learn: 0.3489803	total: 8m 11s	remaining: 53s
1354:	learn: 0.3489393	total: 8m 11s	remaining: 52.6s
1355:	learn: 0.3488893	total: 8m 11s	remaining: 52.2s
1356:	learn: 0.3488439	total: 8m 12s	remaining: 51.9s
1357:	learn: 0.3487940	total: 8m 12s	remaining: 51.5s
1358:	learn: 0.3487384	total: 8m 12s	remaining: 51.1s
1359:	learn: 0.3486946	total: 8m 13s	remaining: 50.8s
1360:	learn: 0.3486468	total: 8m 13s	remaining: 50.4s
1361:	learn: 0.3486126	total: 8m 13s	remaining: 50s
1362:	learn: 0.3485714	total: 8m 14s	remaining: 49.7s
1363:	learn: 0.3485231	total: 8m 14s	remaining: 49.3s
1364:	learn: 0.3484779	total: 8m 14s	remaining: 48.9s
1365:	learn: 0.3484387	total: 8m 15s	remaining: 48.6s
1366:	learn: 0.3483820	total: 8m 15s	remaining: 48.2s
1367:	learn: 0.3483349	total: 8m 15s	remaining: 47.9s
1368:	learn: 0.3482889	total: 8m 16s	remaining: 47.5s
1369:	learn: 0.3482462	total: 8m 16s	remaining: 47.1s
1370:	learn: 0.3482024	total: 8m 17s	remaining: 46.8s
1371:	learn: 0.3481628	total: 8m 17s	remaining: 46.4s
1372:	learn: 0.3481296	total: 8m 17s	remaining: 46s
1373:	learn: 0.3480884	total: 8m 18s	remaining: 45.7s
1374:	learn: 0.3480572	total: 8m 18s	remaining: 45.3s
1375:	learn: 0.3480189	total: 8m 18s	remaining: 44.9s
1376:	learn: 0.3479851	total: 8m 18s	remaining: 44.6s
1377:	learn: 0.3479366	total: 8m 19s	remaining: 44.2s
1378:	learn: 0.3478885	total: 8m 19s	remaining: 43.8s
1379:	learn: 0.3478405	total: 8m 19s	remaining: 43.5s
1380:	learn: 0.3477930	total: 8m 20s	remaining: 43.1s
1381:	learn: 0.3477464	total: 8m 20s	remaining: 42.8s
1382:	learn: 0.3477041	total: 8m 21s	remaining: 42.4s
1383:	learn: 0.3476670	total: 8m 21s	remaining: 42s
1384:	learn: 0.3476174	total: 8m 22s	remaining: 41.7s
1385:	learn: 0.3475747	total: 8m 22s	remaining: 41.3s
1386:	learn: 0.3475322	total: 8m 22s	remaining: 41s
1387:	learn: 0.3474887	total: 8m 23s	remaining: 40.6s
1388:	learn: 0.3474365	total: 8m 23s	remaining: 40.2s
1389:	learn: 0.3473993	total: 8m 23s	remaining: 39.9s
1390:	learn: 0.3473450	total: 8m 24s	remaining: 39.5s
1391:	learn: 0.3473089	total: 8m 24s	remaining: 39.1s
1392:	learn: 0.3472756	total: 8m 24s	remaining: 38.8s
1393:	learn: 0.3472330	total: 8m 25s	remaining: 38.4s
1394:	learn: 0.3471986	total: 8m 25s	remaining: 38s
1395:	learn: 0.3471531	total: 8m 25s	remaining: 37.7s
1396:	learn: 0.3470978	total: 8m 26s	remaining: 37.3s
1397:	learn: 0.3470508	total: 8m 26s	remaining: 37s
1398:	learn: 0.3470029	total: 8m 26s	remaining: 36.6s
1399:	learn: 0.3469460	total: 8m 27s	remaining: 36.2s
1400:	learn: 0.3469099	total: 8m 27s	remaining: 35.9s
1401:	learn: 0.3468597	total: 8m 27s	remaining: 35.5s
1402:	learn: 0.3468106	total: 8m 28s	remaining: 35.1s
1403:	learn: 0.3467686	total: 8m 28s	remaining: 34.8s
1404:	learn: 0.3467380	total: 8m 28s	remaining: 34.4s
1405:	learn: 0.3466915	total: 8m 29s	remaining: 34s
1406:	learn: 0.3466368	total: 8m 29s	remaining: 33.7s
1407:	learn: 0.3465906	total: 8m 29s	remaining: 33.3s
1408:	learn: 0.3465313	total: 8m 30s	remaining: 33s
1409:	learn: 0.3464862	total: 8m 30s	remaining: 32.6s
1410:	learn: 0.3464556	total: 8m 31s	remaining: 32.2s
1411:	learn: 0.3464006	total: 8m 31s	remaining: 31.9s
1412:	learn: 0.3463667	total: 8m 31s	remaining: 31.5s
1413:	learn: 0.3463156	total: 8m 32s	remaining: 31.1s
1414:	learn: 0.3462803	total: 8m 32s	remaining: 30.8s
1415:	learn: 0.3462464	total: 8m 32s	remaining: 30.4s
1416:	learn: 0.3462395	total: 8m 32s	remaining: 30s
1417:	learn: 0.3462019	total: 8m 33s	remaining: 29.7s
1418:	learn: 0.3461558	total: 8m 33s	remaining: 29.3s
1419:	learn: 0.3461283	total: 8m 33s	remaining: 29s
1420:	learn: 0.3460856	total: 8m 34s	remaining: 28.6s
1421:	learn: 0.3460402	total: 8m 34s	remaining: 28.2s
1422:	learn: 0.3459935	total: 8m 35s	remaining: 27.9s
1423:	learn: 0.3459560	total: 8m 35s	remaining: 27.5s
1424:	learn: 0.3459213	total: 8m 35s	remaining: 27.1s
1425:	learn: 0.3458680	total: 8m 35s	remaining: 26.8s
1426:	learn: 0.3458209	total: 8m 36s	remaining: 26.4s
1427:	learn: 0.3457670	total: 8m 36s	remaining: 26.1s
1428:	learn: 0.3457330	total: 8m 37s	remaining: 25.7s
1429:	learn: 0.3456941	total: 8m 37s	remaining: 25.3s
1430:	learn: 0.3456630	total: 8m 37s	remaining: 25s
1431:	learn: 0.3456244	total: 8m 38s	remaining: 24.6s
1432:	learn: 0.3455733	total: 8m 38s	remaining: 24.2s
1433:	learn: 0.3455337	total: 8m 38s	remaining: 23.9s
1434:	learn: 0.3454857	total: 8m 39s	remaining: 23.5s
1435:	learn: 0.3454465	total: 8m 39s	remaining: 23.2s
1436:	learn: 0.3453996	total: 8m 40s	remaining: 22.8s
1437:	learn: 0.3453583	total: 8m 40s	remaining: 22.4s
1438:	learn: 0.3452994	total: 8m 40s	remaining: 22.1s
1439:	learn: 0.3452702	total: 8m 41s	remaining: 21.7s
1440:	learn: 0.3452194	total: 8m 41s	remaining: 21.4s
1441:	learn: 0.3451819	total: 8m 41s	remaining: 21s
1442:	learn: 0.3451517	total: 8m 42s	remaining: 20.6s
1443:	learn: 0.3451034	total: 8m 42s	remaining: 20.3s
1444:	learn: 0.3450576	total: 8m 43s	remaining: 19.9s
1445:	learn: 0.3450069	total: 8m 43s	remaining: 19.5s
1446:	learn: 0.3449674	total: 8m 43s	remaining: 19.2s
1447:	learn: 0.3449203	total: 8m 44s	remaining: 18.8s
1448:	learn: 0.3448831	total: 8m 44s	remaining: 18.5s
1449:	learn: 0.3448466	total: 8m 44s	remaining: 18.1s
1450:	learn: 0.3447996	total: 8m 45s	remaining: 17.7s
1451:	learn: 0.3447541	total: 8m 45s	remaining: 17.4s
1452:	learn: 0.3447118	total: 8m 46s	remaining: 17s
1453:	learn: 0.3446614	total: 8m 46s	remaining: 16.7s
1454:	learn: 0.3446248	total: 8m 46s	remaining: 16.3s
1455:	learn: 0.3445717	total: 8m 47s	remaining: 15.9s
1456:	learn: 0.3445228	total: 8m 47s	remaining: 15.6s
1457:	learn: 0.3444715	total: 8m 48s	remaining: 15.2s
1458:	learn: 0.3444334	total: 8m 48s	remaining: 14.9s
1459:	learn: 0.3443840	total: 8m 48s	remaining: 14.5s
1460:	learn: 0.3443435	total: 8m 49s	remaining: 14.1s
1461:	learn: 0.3443063	total: 8m 49s	remaining: 13.8s
1462:	learn: 0.3442657	total: 8m 49s	remaining: 13.4s
1463:	learn: 0.3442133	total: 8m 50s	remaining: 13s
1464:	learn: 0.3441755	total: 8m 50s	remaining: 12.7s
1465:	learn: 0.3441319	total: 8m 50s	remaining: 12.3s
1466:	learn: 0.3440798	total: 8m 51s	remaining: 11.9s
1467:	learn: 0.3440453	total: 8m 51s	remaining: 11.6s
1468:	learn: 0.3440115	total: 8m 51s	remaining: 11.2s
1469:	learn: 0.3439718	total: 8m 52s	remaining: 10.9s
1470:	learn: 0.3439218	total: 8m 52s	remaining: 10.5s
1471:	learn: 0.3438775	total: 8m 52s	remaining: 10.1s
1472:	learn: 0.3438365	total: 8m 53s	remaining: 9.78s
1473:	learn: 0.3437940	total: 8m 53s	remaining: 9.41s
1474:	learn: 0.3437568	total: 8m 54s	remaining: 9.05s
1475:	learn: 0.3437007	total: 8m 54s	remaining: 8.69s
1476:	learn: 0.3436497	total: 8m 55s	remaining: 8.33s
1477:	learn: 0.3436026	total: 8m 55s	remaining: 7.97s
1478:	learn: 0.3435802	total: 8m 55s	remaining: 7.61s
1479:	learn: 0.3435345	total: 8m 56s	remaining: 7.25s
1480:	learn: 0.3434916	total: 8m 56s	remaining: 6.88s
1481:	learn: 0.3434506	total: 8m 57s	remaining: 6.52s
1482:	learn: 0.3434068	total: 8m 57s	remaining: 6.16s
1483:	learn: 0.3433758	total: 8m 57s	remaining: 5.8s
1484:	learn: 0.3433440	total: 8m 58s	remaining: 5.43s
1485:	learn: 0.3432962	total: 8m 58s	remaining: 5.07s
1486:	learn: 0.3432430	total: 8m 59s	remaining: 4.71s
1487:	learn: 0.3432152	total: 8m 59s	remaining: 4.35s
1488:	learn: 0.3431653	total: 8m 59s	remaining: 3.99s
1489:	learn: 0.3431294	total: 9m	remaining: 3.62s
1490:	learn: 0.3430839	total: 9m	remaining: 3.26s
1491:	learn: 0.3430382	total: 9m	remaining: 2.9s
1492:	learn: 0.3429990	total: 9m 1s	remaining: 2.54s
1493:	learn: 0.3429510	total: 9m 1s	remaining: 2.17s
1494:	learn: 0.3429206	total: 9m 1s	remaining: 1.81s
1495:	learn: 0.3428776	total: 9m 2s	remaining: 1.45s
1496:	learn: 0.3428462	total: 9m 2s	remaining: 1.09s
1497:	learn: 0.3427992	total: 9m 3s	remaining: 725ms
1498:	learn: 0.3427448	total: 9m 3s	remaining: 362ms
1499:	learn: 0.3426978	total: 9m 3s	remaining: 0us
Wall time: 22min 43s
Out[15]:
VotingClassifier(estimators=[('xgbc',
                              XGBClassifier(base_score=None, booster='gbtree',
                                            colsample_bylevel=None,
                                            colsample_bynode=1,
                                            colsample_bytree=1,
                                            enable_categorical=False, gamma=0,
                                            gpu_id=None, importance_type=None,
                                            interaction_constraints=None,
                                            learning_rate=0.3,
                                            max_delta_step=None, max_depth=6,
                                            min_child_weight=None, missing=nan,
                                            monotone_constraints=None,
                                            n_estimators=103, n_jobs=None,
                                            num_parallel_tree=None,
                                            predictor=None, random_state=57,
                                            reg_alpha=None, reg_lambda=None,
                                            scale_pos_weight=None, subsample=1,
                                            tree_method=None,
                                            validate_parameters=None,
                                            verbosity=None)),
                             ('lgbc',
                              LGBMClassifier(n_estimators=2000,
                                             objective='binary',
                                             random_state=57)),
                             ('catgbc',
                              <catboost.core.CatBoostClassifier object at 0x0000020E12E09EB0>)])

Compute FPR + FNR score

In [16]:
enlarged_voting_clf_y_pred = voting_clf_enlarged.predict(enlarged_voting_gbc_X_valid)
valid_score = criterion(enlarged_voting_clf_y_pred, enlarged_voting_gbc_y_valid)
print('FPR + FNR = {}'.format(valid_score))
FPR + FNR = 0.37937935017824587

Result discussion

The $FPR$ and $FNR$ rate is the lower for this notebook: 0.379. The voting system provides that encompass Three boosting models train on the enlarged give better performance than XGBoost, CatBoost and LightGBM on the binary classification task. We therefore deduce taht the generation of new features seems to improve the performance of the model.

Fit the best model on the entire dataset

We prepare the submission of the best model by training the model on all the data at our disposal without splitting the dataset.

The best performing model is a voting model that which includes 3 boosting algorithms

  • XGBoost
  • LightGBM
  • CatBoost

With a majority voting system

Fitting the best model

In [18]:
%%time
best_estimators = [
    ('xgbc', XGBClassifier(booster='gbtree', learning_rate=0.3, 
                     max_depth=6, n_estimators=103, 
                     colsample_bynode=1, colsample_bytree=1,
                     subsample=1, gamma=0, 
                     objective='binary:logistic', random_state=57)),
    
    ('lgbc', LGBMClassifier(objective= 'binary', 
                            n_estimators = 2000, random_state=57)),
    
    ('catgbc', CatBoostClassifier(eval_metric= 'Logloss', iterations= 1500, 
                                  learning_rate= 0.1, subsample= 0.8, random_state=57))
]

best_model = VotingClassifier(estimators=best_estimators, voting='hard')

best_model.fit(X_dataframe_enlarged, y)
[15:51:44] WARNING: C:/Users/Administrator/workspace/xgboost-win64_release_1.5.1/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
0:	learn: 0.6611588	total: 446ms	remaining: 11m 8s
1:	learn: 0.6365899	total: 885ms	remaining: 11m 2s
2:	learn: 0.6151279	total: 1.51s	remaining: 12m 34s
3:	learn: 0.5980573	total: 1.95s	remaining: 12m 10s
4:	learn: 0.5829017	total: 2.53s	remaining: 12m 36s
5:	learn: 0.5709179	total: 2.96s	remaining: 12m 16s
6:	learn: 0.5610373	total: 3.36s	remaining: 11m 55s
7:	learn: 0.5529757	total: 3.82s	remaining: 11m 52s
8:	learn: 0.5447953	total: 4.32s	remaining: 11m 54s
9:	learn: 0.5370827	total: 4.84s	remaining: 12m
10:	learn: 0.5314296	total: 5.21s	remaining: 11m 44s
11:	learn: 0.5265824	total: 5.65s	remaining: 11m 40s
12:	learn: 0.5221126	total: 6.17s	remaining: 11m 45s
13:	learn: 0.5171303	total: 6.72s	remaining: 11m 53s
14:	learn: 0.5129953	total: 7.2s	remaining: 11m 53s
15:	learn: 0.5096825	total: 7.59s	remaining: 11m 44s
16:	learn: 0.5058503	total: 8.1s	remaining: 11m 46s
17:	learn: 0.5029005	total: 8.57s	remaining: 11m 45s
18:	learn: 0.4999548	total: 8.95s	remaining: 11m 37s
19:	learn: 0.4968038	total: 9.59s	remaining: 11m 49s
20:	learn: 0.4945458	total: 9.92s	remaining: 11m 38s
21:	learn: 0.4915481	total: 10.5s	remaining: 11m 42s
22:	learn: 0.4888905	total: 10.8s	remaining: 11m 34s
23:	learn: 0.4866482	total: 11.3s	remaining: 11m 32s
24:	learn: 0.4846385	total: 11.6s	remaining: 11m 26s
25:	learn: 0.4826956	total: 12.1s	remaining: 11m 26s
26:	learn: 0.4809004	total: 12.5s	remaining: 11m 24s
27:	learn: 0.4790002	total: 13.1s	remaining: 11m 26s
28:	learn: 0.4770126	total: 13.6s	remaining: 11m 27s
29:	learn: 0.4753694	total: 13.9s	remaining: 11m 22s
30:	learn: 0.4739668	total: 14.3s	remaining: 11m 19s
31:	learn: 0.4723943	total: 14.9s	remaining: 11m 22s
32:	learn: 0.4710432	total: 15.2s	remaining: 11m 17s
33:	learn: 0.4697410	total: 15.7s	remaining: 11m 16s
34:	learn: 0.4687035	total: 16.1s	remaining: 11m 14s
35:	learn: 0.4675388	total: 16.4s	remaining: 11m 7s
36:	learn: 0.4665751	total: 16.8s	remaining: 11m 5s
37:	learn: 0.4655575	total: 17.2s	remaining: 11m 2s
38:	learn: 0.4641191	total: 17.8s	remaining: 11m 8s
39:	learn: 0.4628306	total: 18.4s	remaining: 11m 11s
40:	learn: 0.4618577	total: 18.8s	remaining: 11m 9s
41:	learn: 0.4609783	total: 19.3s	remaining: 11m 9s
42:	learn: 0.4602470	total: 19.7s	remaining: 11m 6s
43:	learn: 0.4591470	total: 20.1s	remaining: 11m 5s
44:	learn: 0.4581572	total: 20.6s	remaining: 11m 5s
45:	learn: 0.4572215	total: 21.1s	remaining: 11m 6s
46:	learn: 0.4565394	total: 21.5s	remaining: 11m 3s
47:	learn: 0.4558592	total: 21.8s	remaining: 11m
48:	learn: 0.4551848	total: 22.2s	remaining: 10m 58s
49:	learn: 0.4544369	total: 22.9s	remaining: 11m 2s
50:	learn: 0.4536893	total: 23.3s	remaining: 11m 2s
51:	learn: 0.4530051	total: 23.8s	remaining: 11m 2s
52:	learn: 0.4523225	total: 24.2s	remaining: 11m
53:	learn: 0.4515679	total: 24.8s	remaining: 11m 3s
54:	learn: 0.4509391	total: 25.2s	remaining: 11m 2s
55:	learn: 0.4504344	total: 25.6s	remaining: 11m
56:	learn: 0.4499197	total: 26s	remaining: 10m 58s
57:	learn: 0.4494307	total: 26.4s	remaining: 10m 55s
58:	learn: 0.4489795	total: 26.7s	remaining: 10m 53s
59:	learn: 0.4481455	total: 27.3s	remaining: 10m 55s
60:	learn: 0.4474399	total: 27.8s	remaining: 10m 54s
61:	learn: 0.4468463	total: 28.3s	remaining: 10m 55s
62:	learn: 0.4463946	total: 28.7s	remaining: 10m 54s
63:	learn: 0.4459042	total: 29.1s	remaining: 10m 53s
64:	learn: 0.4454710	total: 29.6s	remaining: 10m 53s
65:	learn: 0.4450077	total: 29.9s	remaining: 10m 49s
66:	learn: 0.4444788	total: 30.4s	remaining: 10m 49s
67:	learn: 0.4438580	total: 30.8s	remaining: 10m 49s
68:	learn: 0.4434252	total: 31.2s	remaining: 10m 47s
69:	learn: 0.4430749	total: 31.6s	remaining: 10m 46s
70:	learn: 0.4425822	total: 32s	remaining: 10m 44s
71:	learn: 0.4421582	total: 32.4s	remaining: 10m 43s
72:	learn: 0.4417505	total: 32.9s	remaining: 10m 42s
73:	learn: 0.4412594	total: 33.3s	remaining: 10m 41s
74:	learn: 0.4408368	total: 33.6s	remaining: 10m 39s
75:	learn: 0.4405047	total: 34.1s	remaining: 10m 38s
76:	learn: 0.4402149	total: 34.5s	remaining: 10m 38s
77:	learn: 0.4398839	total: 34.9s	remaining: 10m 36s
78:	learn: 0.4395284	total: 35.3s	remaining: 10m 35s
79:	learn: 0.4391788	total: 35.6s	remaining: 10m 32s
80:	learn: 0.4389093	total: 36s	remaining: 10m 30s
81:	learn: 0.4386019	total: 36.3s	remaining: 10m 28s
82:	learn: 0.4383357	total: 36.7s	remaining: 10m 26s
83:	learn: 0.4380854	total: 37.1s	remaining: 10m 24s
84:	learn: 0.4377909	total: 37.5s	remaining: 10m 23s
85:	learn: 0.4375083	total: 37.8s	remaining: 10m 21s
86:	learn: 0.4371983	total: 38.2s	remaining: 10m 21s
87:	learn: 0.4367415	total: 38.7s	remaining: 10m 21s
88:	learn: 0.4364501	total: 39.3s	remaining: 10m 23s
89:	learn: 0.4362124	total: 39.8s	remaining: 10m 24s
90:	learn: 0.4359129	total: 40.3s	remaining: 10m 24s
91:	learn: 0.4356833	total: 40.8s	remaining: 10m 23s
92:	learn: 0.4354152	total: 41.3s	remaining: 10m 25s
93:	learn: 0.4351700	total: 41.7s	remaining: 10m 24s
94:	learn: 0.4349696	total: 42.2s	remaining: 10m 24s
95:	learn: 0.4346880	total: 42.7s	remaining: 10m 24s
96:	learn: 0.4344034	total: 43.2s	remaining: 10m 24s
97:	learn: 0.4341834	total: 43.6s	remaining: 10m 23s
98:	learn: 0.4339310	total: 44s	remaining: 10m 22s
99:	learn: 0.4336686	total: 44.5s	remaining: 10m 22s
100:	learn: 0.4333562	total: 44.9s	remaining: 10m 21s
101:	learn: 0.4331791	total: 45.2s	remaining: 10m 19s
102:	learn: 0.4329965	total: 45.6s	remaining: 10m 18s
103:	learn: 0.4328012	total: 46.1s	remaining: 10m 18s
104:	learn: 0.4326067	total: 46.4s	remaining: 10m 16s
105:	learn: 0.4323306	total: 46.8s	remaining: 10m 16s
106:	learn: 0.4320159	total: 47.3s	remaining: 10m 16s
107:	learn: 0.4318275	total: 47.7s	remaining: 10m 14s
108:	learn: 0.4315211	total: 48.2s	remaining: 10m 14s
109:	learn: 0.4312783	total: 48.6s	remaining: 10m 14s
110:	learn: 0.4310964	total: 49.1s	remaining: 10m 13s
111:	learn: 0.4309048	total: 49.5s	remaining: 10m 12s
112:	learn: 0.4307029	total: 49.9s	remaining: 10m 12s
113:	learn: 0.4304532	total: 50.4s	remaining: 10m 13s
114:	learn: 0.4301977	total: 50.9s	remaining: 10m 13s
115:	learn: 0.4300227	total: 51.4s	remaining: 10m 12s
116:	learn: 0.4297982	total: 51.7s	remaining: 10m 11s
117:	learn: 0.4296270	total: 52s	remaining: 10m 9s
118:	learn: 0.4293765	total: 52.5s	remaining: 10m 9s
119:	learn: 0.4291236	total: 53s	remaining: 10m 9s
120:	learn: 0.4289564	total: 53.4s	remaining: 10m 8s
121:	learn: 0.4287667	total: 53.8s	remaining: 10m 7s
122:	learn: 0.4285339	total: 54.3s	remaining: 10m 7s
123:	learn: 0.4283582	total: 54.7s	remaining: 10m 7s
124:	learn: 0.4281649	total: 55.1s	remaining: 10m 6s
125:	learn: 0.4280066	total: 55.5s	remaining: 10m 5s
126:	learn: 0.4278106	total: 56s	remaining: 10m 5s
127:	learn: 0.4276096	total: 56.4s	remaining: 10m 4s
128:	learn: 0.4274593	total: 56.8s	remaining: 10m 4s
129:	learn: 0.4272586	total: 57.4s	remaining: 10m 4s
130:	learn: 0.4271472	total: 57.7s	remaining: 10m 3s
131:	learn: 0.4269647	total: 58.2s	remaining: 10m 3s
132:	learn: 0.4267423	total: 58.7s	remaining: 10m 2s
133:	learn: 0.4265263	total: 59.2s	remaining: 10m 2s
134:	learn: 0.4263351	total: 59.6s	remaining: 10m 2s
135:	learn: 0.4261407	total: 60s	remaining: 10m 1s
136:	learn: 0.4260009	total: 1m	remaining: 10m 1s
137:	learn: 0.4258308	total: 1m	remaining: 9m 59s
138:	learn: 0.4256232	total: 1m 1s	remaining: 9m 59s
139:	learn: 0.4254598	total: 1m 1s	remaining: 9m 58s
140:	learn: 0.4252080	total: 1m 2s	remaining: 9m 59s
141:	learn: 0.4250201	total: 1m 2s	remaining: 9m 59s
142:	learn: 0.4247949	total: 1m 3s	remaining: 10m 1s
143:	learn: 0.4246167	total: 1m 3s	remaining: 10m 1s
144:	learn: 0.4244757	total: 1m 4s	remaining: 10m 1s
145:	learn: 0.4243377	total: 1m 4s	remaining: 10m 1s
146:	learn: 0.4241763	total: 1m 5s	remaining: 10m 2s
147:	learn: 0.4239647	total: 1m 5s	remaining: 10m 2s
148:	learn: 0.4237755	total: 1m 6s	remaining: 10m 3s
149:	learn: 0.4236003	total: 1m 7s	remaining: 10m 4s
150:	learn: 0.4234229	total: 1m 7s	remaining: 10m 4s
151:	learn: 0.4232987	total: 1m 8s	remaining: 10m 3s
152:	learn: 0.4231684	total: 1m 8s	remaining: 10m 1s
153:	learn: 0.4229419	total: 1m 8s	remaining: 10m 1s
154:	learn: 0.4228126	total: 1m 9s	remaining: 10m
155:	learn: 0.4226296	total: 1m 9s	remaining: 10m 1s
156:	learn: 0.4224851	total: 1m 10s	remaining: 10m
157:	learn: 0.4223616	total: 1m 10s	remaining: 10m
158:	learn: 0.4221810	total: 1m 11s	remaining: 10m 1s
159:	learn: 0.4220037	total: 1m 11s	remaining: 10m 2s
160:	learn: 0.4218698	total: 1m 12s	remaining: 10m 2s
161:	learn: 0.4217290	total: 1m 12s	remaining: 10m 1s
162:	learn: 0.4215779	total: 1m 13s	remaining: 10m 1s
163:	learn: 0.4214405	total: 1m 13s	remaining: 10m 1s
164:	learn: 0.4212943	total: 1m 14s	remaining: 10m 1s
165:	learn: 0.4211472	total: 1m 14s	remaining: 10m
166:	learn: 0.4209492	total: 1m 15s	remaining: 10m
167:	learn: 0.4208351	total: 1m 15s	remaining: 9m 59s
168:	learn: 0.4207036	total: 1m 16s	remaining: 9m 59s
169:	learn: 0.4205785	total: 1m 16s	remaining: 9m 59s
170:	learn: 0.4204701	total: 1m 16s	remaining: 9m 57s
171:	learn: 0.4203550	total: 1m 17s	remaining: 9m 57s
172:	learn: 0.4201312	total: 1m 17s	remaining: 9m 56s
173:	learn: 0.4199978	total: 1m 18s	remaining: 9m 56s
174:	learn: 0.4198404	total: 1m 18s	remaining: 9m 55s
175:	learn: 0.4196616	total: 1m 19s	remaining: 9m 55s
176:	learn: 0.4195407	total: 1m 19s	remaining: 9m 54s
177:	learn: 0.4194146	total: 1m 19s	remaining: 9m 53s
178:	learn: 0.4192724	total: 1m 20s	remaining: 9m 53s
179:	learn: 0.4191596	total: 1m 20s	remaining: 9m 52s
180:	learn: 0.4190231	total: 1m 21s	remaining: 9m 52s
181:	learn: 0.4188717	total: 1m 21s	remaining: 9m 52s
182:	learn: 0.4187459	total: 1m 22s	remaining: 9m 51s
183:	learn: 0.4186362	total: 1m 22s	remaining: 9m 50s
184:	learn: 0.4185067	total: 1m 23s	remaining: 9m 50s
185:	learn: 0.4184142	total: 1m 23s	remaining: 9m 49s
186:	learn: 0.4183146	total: 1m 23s	remaining: 9m 48s
187:	learn: 0.4182008	total: 1m 24s	remaining: 9m 48s
188:	learn: 0.4180976	total: 1m 24s	remaining: 9m 46s
189:	learn: 0.4179838	total: 1m 24s	remaining: 9m 46s
190:	learn: 0.4178894	total: 1m 25s	remaining: 9m 45s
191:	learn: 0.4177393	total: 1m 25s	remaining: 9m 44s
192:	learn: 0.4176210	total: 1m 26s	remaining: 9m 45s
193:	learn: 0.4175033	total: 1m 26s	remaining: 9m 45s
194:	learn: 0.4173415	total: 1m 27s	remaining: 9m 45s
195:	learn: 0.4172315	total: 1m 27s	remaining: 9m 44s
196:	learn: 0.4171159	total: 1m 28s	remaining: 9m 44s
197:	learn: 0.4170299	total: 1m 28s	remaining: 9m 43s
198:	learn: 0.4169385	total: 1m 29s	remaining: 9m 42s
199:	learn: 0.4168477	total: 1m 29s	remaining: 9m 41s
200:	learn: 0.4167127	total: 1m 29s	remaining: 9m 41s
201:	learn: 0.4166327	total: 1m 30s	remaining: 9m 40s
202:	learn: 0.4165362	total: 1m 30s	remaining: 9m 39s
203:	learn: 0.4164427	total: 1m 31s	remaining: 9m 38s
204:	learn: 0.4163496	total: 1m 31s	remaining: 9m 38s
205:	learn: 0.4162372	total: 1m 31s	remaining: 9m 37s
206:	learn: 0.4161369	total: 1m 32s	remaining: 9m 36s
207:	learn: 0.4160393	total: 1m 32s	remaining: 9m 36s
208:	learn: 0.4159425	total: 1m 33s	remaining: 9m 35s
209:	learn: 0.4158449	total: 1m 33s	remaining: 9m 34s
210:	learn: 0.4157632	total: 1m 33s	remaining: 9m 33s
211:	learn: 0.4156764	total: 1m 34s	remaining: 9m 32s
212:	learn: 0.4155403	total: 1m 34s	remaining: 9m 32s
213:	learn: 0.4154257	total: 1m 35s	remaining: 9m 32s
214:	learn: 0.4153188	total: 1m 35s	remaining: 9m 31s
215:	learn: 0.4151907	total: 1m 36s	remaining: 9m 30s
216:	learn: 0.4150765	total: 1m 36s	remaining: 9m 30s
217:	learn: 0.4149944	total: 1m 36s	remaining: 9m 29s
218:	learn: 0.4148797	total: 1m 37s	remaining: 9m 29s
219:	learn: 0.4147948	total: 1m 37s	remaining: 9m 28s
220:	learn: 0.4147109	total: 1m 38s	remaining: 9m 28s
221:	learn: 0.4145625	total: 1m 38s	remaining: 9m 27s
222:	learn: 0.4144291	total: 1m 39s	remaining: 9m 27s
223:	learn: 0.4143306	total: 1m 39s	remaining: 9m 27s
224:	learn: 0.4142297	total: 1m 39s	remaining: 9m 26s
225:	learn: 0.4141483	total: 1m 40s	remaining: 9m 25s
226:	learn: 0.4140365	total: 1m 40s	remaining: 9m 25s
227:	learn: 0.4139186	total: 1m 41s	remaining: 9m 25s
228:	learn: 0.4138289	total: 1m 41s	remaining: 9m 24s
229:	learn: 0.4137519	total: 1m 42s	remaining: 9m 23s
230:	learn: 0.4136619	total: 1m 42s	remaining: 9m 22s
231:	learn: 0.4135407	total: 1m 42s	remaining: 9m 22s
232:	learn: 0.4134553	total: 1m 43s	remaining: 9m 21s
233:	learn: 0.4133651	total: 1m 43s	remaining: 9m 21s
234:	learn: 0.4132720	total: 1m 44s	remaining: 9m 21s
235:	learn: 0.4131939	total: 1m 44s	remaining: 9m 20s
236:	learn: 0.4131113	total: 1m 45s	remaining: 9m 20s
237:	learn: 0.4130111	total: 1m 45s	remaining: 9m 19s
238:	learn: 0.4129360	total: 1m 45s	remaining: 9m 18s
239:	learn: 0.4128622	total: 1m 46s	remaining: 9m 18s
240:	learn: 0.4127782	total: 1m 46s	remaining: 9m 18s
241:	learn: 0.4126726	total: 1m 47s	remaining: 9m 18s
242:	learn: 0.4125679	total: 1m 47s	remaining: 9m 17s
243:	learn: 0.4124613	total: 1m 48s	remaining: 9m 17s
244:	learn: 0.4123860	total: 1m 48s	remaining: 9m 16s
245:	learn: 0.4122890	total: 1m 49s	remaining: 9m 15s
246:	learn: 0.4122014	total: 1m 49s	remaining: 9m 15s
247:	learn: 0.4120825	total: 1m 50s	remaining: 9m 15s
248:	learn: 0.4120013	total: 1m 50s	remaining: 9m 14s
249:	learn: 0.4119058	total: 1m 50s	remaining: 9m 13s
250:	learn: 0.4118124	total: 1m 51s	remaining: 9m 13s
251:	learn: 0.4117126	total: 1m 51s	remaining: 9m 12s
252:	learn: 0.4116100	total: 1m 52s	remaining: 9m 13s
253:	learn: 0.4115065	total: 1m 52s	remaining: 9m 12s
254:	learn: 0.4114271	total: 1m 53s	remaining: 9m 11s
255:	learn: 0.4113533	total: 1m 53s	remaining: 9m 10s
256:	learn: 0.4112809	total: 1m 53s	remaining: 9m 10s
257:	learn: 0.4112049	total: 1m 54s	remaining: 9m 9s
258:	learn: 0.4111076	total: 1m 54s	remaining: 9m 9s
259:	learn: 0.4110328	total: 1m 55s	remaining: 9m 8s
260:	learn: 0.4109715	total: 1m 55s	remaining: 9m 7s
261:	learn: 0.4109018	total: 1m 55s	remaining: 9m 7s
262:	learn: 0.4108141	total: 1m 56s	remaining: 9m 6s
263:	learn: 0.4107396	total: 1m 56s	remaining: 9m 6s
264:	learn: 0.4106460	total: 1m 57s	remaining: 9m 5s
265:	learn: 0.4105651	total: 1m 57s	remaining: 9m 4s
266:	learn: 0.4105085	total: 1m 57s	remaining: 9m 3s
267:	learn: 0.4104292	total: 1m 58s	remaining: 9m 3s
268:	learn: 0.4103356	total: 1m 58s	remaining: 9m 4s
269:	learn: 0.4102501	total: 1m 59s	remaining: 9m 3s
270:	learn: 0.4101731	total: 1m 59s	remaining: 9m 3s
271:	learn: 0.4101047	total: 2m	remaining: 9m 2s
272:	learn: 0.4100287	total: 2m	remaining: 9m 2s
273:	learn: 0.4099501	total: 2m 1s	remaining: 9m 2s
274:	learn: 0.4098782	total: 2m 1s	remaining: 9m 1s
275:	learn: 0.4097891	total: 2m 2s	remaining: 9m 1s
276:	learn: 0.4097198	total: 2m 2s	remaining: 9m
277:	learn: 0.4096602	total: 2m 2s	remaining: 8m 59s
278:	learn: 0.4095679	total: 2m 3s	remaining: 8m 59s
279:	learn: 0.4095154	total: 2m 3s	remaining: 8m 58s
280:	learn: 0.4094432	total: 2m 3s	remaining: 8m 57s
281:	learn: 0.4093783	total: 2m 4s	remaining: 8m 56s
282:	learn: 0.4093152	total: 2m 4s	remaining: 8m 56s
283:	learn: 0.4092423	total: 2m 5s	remaining: 8m 55s
284:	learn: 0.4091658	total: 2m 5s	remaining: 8m 55s
285:	learn: 0.4090743	total: 2m 6s	remaining: 8m 55s
286:	learn: 0.4089885	total: 2m 6s	remaining: 8m 54s
287:	learn: 0.4088890	total: 2m 7s	remaining: 8m 54s
288:	learn: 0.4088080	total: 2m 7s	remaining: 8m 54s
289:	learn: 0.4087401	total: 2m 8s	remaining: 8m 54s
290:	learn: 0.4086692	total: 2m 8s	remaining: 8m 54s
291:	learn: 0.4085616	total: 2m 9s	remaining: 8m 54s
292:	learn: 0.4084887	total: 2m 9s	remaining: 8m 54s
293:	learn: 0.4084231	total: 2m 10s	remaining: 8m 53s
294:	learn: 0.4083341	total: 2m 10s	remaining: 8m 53s
295:	learn: 0.4082495	total: 2m 11s	remaining: 8m 53s
296:	learn: 0.4081691	total: 2m 11s	remaining: 8m 53s
297:	learn: 0.4080718	total: 2m 12s	remaining: 8m 52s
298:	learn: 0.4079891	total: 2m 12s	remaining: 8m 52s
299:	learn: 0.4078823	total: 2m 13s	remaining: 8m 52s
300:	learn: 0.4078065	total: 2m 13s	remaining: 8m 51s
301:	learn: 0.4077316	total: 2m 13s	remaining: 8m 51s
302:	learn: 0.4076723	total: 2m 14s	remaining: 8m 50s
303:	learn: 0.4075758	total: 2m 14s	remaining: 8m 50s
304:	learn: 0.4075124	total: 2m 15s	remaining: 8m 50s
305:	learn: 0.4074602	total: 2m 15s	remaining: 8m 49s
306:	learn: 0.4073827	total: 2m 16s	remaining: 8m 49s
307:	learn: 0.4073177	total: 2m 16s	remaining: 8m 48s
308:	learn: 0.4072460	total: 2m 17s	remaining: 8m 48s
309:	learn: 0.4071531	total: 2m 17s	remaining: 8m 48s
310:	learn: 0.4070785	total: 2m 18s	remaining: 8m 47s
311:	learn: 0.4070199	total: 2m 18s	remaining: 8m 47s
312:	learn: 0.4069407	total: 2m 18s	remaining: 8m 46s
313:	learn: 0.4068703	total: 2m 19s	remaining: 8m 46s
314:	learn: 0.4067862	total: 2m 19s	remaining: 8m 45s
315:	learn: 0.4067023	total: 2m 20s	remaining: 8m 45s
316:	learn: 0.4066411	total: 2m 20s	remaining: 8m 45s
317:	learn: 0.4065367	total: 2m 21s	remaining: 8m 44s
318:	learn: 0.4064781	total: 2m 21s	remaining: 8m 44s
319:	learn: 0.4064071	total: 2m 22s	remaining: 8m 44s
320:	learn: 0.4063371	total: 2m 22s	remaining: 8m 44s
321:	learn: 0.4062724	total: 2m 23s	remaining: 8m 43s
322:	learn: 0.4062068	total: 2m 23s	remaining: 8m 42s
323:	learn: 0.4060957	total: 2m 23s	remaining: 8m 42s
324:	learn: 0.4060358	total: 2m 24s	remaining: 8m 41s
325:	learn: 0.4059836	total: 2m 24s	remaining: 8m 41s
326:	learn: 0.4059205	total: 2m 25s	remaining: 8m 40s
327:	learn: 0.4058591	total: 2m 25s	remaining: 8m 40s
328:	learn: 0.4058243	total: 2m 25s	remaining: 8m 39s
329:	learn: 0.4057614	total: 2m 26s	remaining: 8m 38s
330:	learn: 0.4056458	total: 2m 26s	remaining: 8m 38s
331:	learn: 0.4055764	total: 2m 27s	remaining: 8m 37s
332:	learn: 0.4054685	total: 2m 27s	remaining: 8m 37s
333:	learn: 0.4053910	total: 2m 28s	remaining: 8m 36s
334:	learn: 0.4053163	total: 2m 28s	remaining: 8m 36s
335:	learn: 0.4052461	total: 2m 28s	remaining: 8m 35s
336:	learn: 0.4052203	total: 2m 29s	remaining: 8m 35s
337:	learn: 0.4051599	total: 2m 29s	remaining: 8m 34s
338:	learn: 0.4050921	total: 2m 30s	remaining: 8m 34s
339:	learn: 0.4050256	total: 2m 30s	remaining: 8m 33s
340:	learn: 0.4049266	total: 2m 31s	remaining: 8m 33s
341:	learn: 0.4048679	total: 2m 31s	remaining: 8m 32s
342:	learn: 0.4047920	total: 2m 32s	remaining: 8m 32s
343:	learn: 0.4047391	total: 2m 32s	remaining: 8m 32s
344:	learn: 0.4046504	total: 2m 33s	remaining: 8m 32s
345:	learn: 0.4045686	total: 2m 33s	remaining: 8m 31s
346:	learn: 0.4045074	total: 2m 33s	remaining: 8m 31s
347:	learn: 0.4044472	total: 2m 34s	remaining: 8m 30s
348:	learn: 0.4043746	total: 2m 34s	remaining: 8m 30s
349:	learn: 0.4043124	total: 2m 35s	remaining: 8m 29s
350:	learn: 0.4042584	total: 2m 35s	remaining: 8m 28s
351:	learn: 0.4041965	total: 2m 35s	remaining: 8m 28s
352:	learn: 0.4041401	total: 2m 36s	remaining: 8m 27s
353:	learn: 0.4040696	total: 2m 36s	remaining: 8m 27s
354:	learn: 0.4040020	total: 2m 37s	remaining: 8m 26s
355:	learn: 0.4039407	total: 2m 37s	remaining: 8m 25s
356:	learn: 0.4038740	total: 2m 37s	remaining: 8m 25s
357:	learn: 0.4037812	total: 2m 38s	remaining: 8m 24s
358:	learn: 0.4037227	total: 2m 38s	remaining: 8m 24s
359:	learn: 0.4036545	total: 2m 39s	remaining: 8m 23s
360:	learn: 0.4035864	total: 2m 39s	remaining: 8m 23s
361:	learn: 0.4035073	total: 2m 40s	remaining: 8m 22s
362:	learn: 0.4034395	total: 2m 40s	remaining: 8m 22s
363:	learn: 0.4033744	total: 2m 40s	remaining: 8m 21s
364:	learn: 0.4033128	total: 2m 41s	remaining: 8m 21s
365:	learn: 0.4032490	total: 2m 41s	remaining: 8m 20s
366:	learn: 0.4031850	total: 2m 42s	remaining: 8m 20s
367:	learn: 0.4031123	total: 2m 42s	remaining: 8m 19s
368:	learn: 0.4030493	total: 2m 42s	remaining: 8m 19s
369:	learn: 0.4029815	total: 2m 43s	remaining: 8m 18s
370:	learn: 0.4029262	total: 2m 43s	remaining: 8m 17s
371:	learn: 0.4028678	total: 2m 44s	remaining: 8m 17s
372:	learn: 0.4027976	total: 2m 44s	remaining: 8m 16s
373:	learn: 0.4027455	total: 2m 44s	remaining: 8m 16s
374:	learn: 0.4026977	total: 2m 45s	remaining: 8m 15s
375:	learn: 0.4026340	total: 2m 45s	remaining: 8m 14s
376:	learn: 0.4025432	total: 2m 46s	remaining: 8m 14s
377:	learn: 0.4024729	total: 2m 46s	remaining: 8m 14s
378:	learn: 0.4024436	total: 2m 46s	remaining: 8m 13s
379:	learn: 0.4023871	total: 2m 47s	remaining: 8m 12s
380:	learn: 0.4023394	total: 2m 47s	remaining: 8m 12s
381:	learn: 0.4023011	total: 2m 47s	remaining: 8m 11s
382:	learn: 0.4022518	total: 2m 48s	remaining: 8m 11s
383:	learn: 0.4021879	total: 2m 48s	remaining: 8m 10s
384:	learn: 0.4021216	total: 2m 49s	remaining: 8m 10s
385:	learn: 0.4020568	total: 2m 49s	remaining: 8m 9s
386:	learn: 0.4020097	total: 2m 49s	remaining: 8m 8s
387:	learn: 0.4019533	total: 2m 50s	remaining: 8m 8s
388:	learn: 0.4018949	total: 2m 50s	remaining: 8m 7s
389:	learn: 0.4018343	total: 2m 51s	remaining: 8m 7s
390:	learn: 0.4017436	total: 2m 51s	remaining: 8m 6s
391:	learn: 0.4016731	total: 2m 52s	remaining: 8m 6s
392:	learn: 0.4016048	total: 2m 52s	remaining: 8m 6s
393:	learn: 0.4015114	total: 2m 53s	remaining: 8m 5s
394:	learn: 0.4014484	total: 2m 53s	remaining: 8m 5s
395:	learn: 0.4013853	total: 2m 53s	remaining: 8m 4s
396:	learn: 0.4013270	total: 2m 54s	remaining: 8m 4s
397:	learn: 0.4012830	total: 2m 54s	remaining: 8m 3s
398:	learn: 0.4012306	total: 2m 55s	remaining: 8m 2s
399:	learn: 0.4011867	total: 2m 55s	remaining: 8m 2s
400:	learn: 0.4011335	total: 2m 55s	remaining: 8m 1s
401:	learn: 0.4010598	total: 2m 56s	remaining: 8m 1s
402:	learn: 0.4010103	total: 2m 56s	remaining: 8m
403:	learn: 0.4009567	total: 2m 57s	remaining: 8m
404:	learn: 0.4009111	total: 2m 57s	remaining: 7m 59s
405:	learn: 0.4008437	total: 2m 57s	remaining: 7m 59s
406:	learn: 0.4007808	total: 2m 58s	remaining: 7m 59s
407:	learn: 0.4007165	total: 2m 58s	remaining: 7m 58s
408:	learn: 0.4006626	total: 2m 59s	remaining: 7m 57s
409:	learn: 0.4006045	total: 2m 59s	remaining: 7m 57s
410:	learn: 0.4005403	total: 3m	remaining: 7m 56s
411:	learn: 0.4004857	total: 3m	remaining: 7m 56s
412:	learn: 0.4004198	total: 3m	remaining: 7m 56s
413:	learn: 0.4003653	total: 3m 1s	remaining: 7m 55s
414:	learn: 0.4003052	total: 3m 1s	remaining: 7m 55s
415:	learn: 0.4002517	total: 3m 2s	remaining: 7m 54s
416:	learn: 0.4002058	total: 3m 2s	remaining: 7m 54s
417:	learn: 0.4001418	total: 3m 3s	remaining: 7m 54s
418:	learn: 0.4000876	total: 3m 3s	remaining: 7m 54s
419:	learn: 0.4000462	total: 3m 4s	remaining: 7m 53s
420:	learn: 0.3999902	total: 3m 4s	remaining: 7m 53s
421:	learn: 0.3998981	total: 3m 5s	remaining: 7m 53s
422:	learn: 0.3998385	total: 3m 5s	remaining: 7m 53s
423:	learn: 0.3997740	total: 3m 6s	remaining: 7m 52s
424:	learn: 0.3996988	total: 3m 6s	remaining: 7m 52s
425:	learn: 0.3996273	total: 3m 7s	remaining: 7m 51s
426:	learn: 0.3995641	total: 3m 7s	remaining: 7m 51s
427:	learn: 0.3995211	total: 3m 8s	remaining: 7m 50s
428:	learn: 0.3994568	total: 3m 8s	remaining: 7m 50s
429:	learn: 0.3993995	total: 3m 8s	remaining: 7m 50s
430:	learn: 0.3993317	total: 3m 9s	remaining: 7m 49s
431:	learn: 0.3992821	total: 3m 9s	remaining: 7m 49s
432:	learn: 0.3992277	total: 3m 10s	remaining: 7m 48s
433:	learn: 0.3991602	total: 3m 10s	remaining: 7m 48s
434:	learn: 0.3990995	total: 3m 10s	remaining: 7m 47s
435:	learn: 0.3990413	total: 3m 11s	remaining: 7m 47s
436:	learn: 0.3989836	total: 3m 11s	remaining: 7m 46s
437:	learn: 0.3989389	total: 3m 12s	remaining: 7m 46s
438:	learn: 0.3988730	total: 3m 12s	remaining: 7m 45s
439:	learn: 0.3988319	total: 3m 13s	remaining: 7m 45s
440:	learn: 0.3987586	total: 3m 13s	remaining: 7m 44s
441:	learn: 0.3986557	total: 3m 14s	remaining: 7m 44s
442:	learn: 0.3985988	total: 3m 14s	remaining: 7m 43s
443:	learn: 0.3985244	total: 3m 14s	remaining: 7m 43s
444:	learn: 0.3984522	total: 3m 15s	remaining: 7m 43s
445:	learn: 0.3983971	total: 3m 15s	remaining: 7m 42s
446:	learn: 0.3983357	total: 3m 16s	remaining: 7m 41s
447:	learn: 0.3982718	total: 3m 16s	remaining: 7m 41s
448:	learn: 0.3982124	total: 3m 16s	remaining: 7m 40s
449:	learn: 0.3981529	total: 3m 17s	remaining: 7m 40s
450:	learn: 0.3980902	total: 3m 17s	remaining: 7m 40s
451:	learn: 0.3980428	total: 3m 18s	remaining: 7m 39s
452:	learn: 0.3979867	total: 3m 18s	remaining: 7m 39s
453:	learn: 0.3979336	total: 3m 19s	remaining: 7m 38s
454:	learn: 0.3978700	total: 3m 19s	remaining: 7m 38s
455:	learn: 0.3978068	total: 3m 20s	remaining: 7m 38s
456:	learn: 0.3977582	total: 3m 20s	remaining: 7m 37s
457:	learn: 0.3976970	total: 3m 20s	remaining: 7m 37s
458:	learn: 0.3976385	total: 3m 21s	remaining: 7m 36s
459:	learn: 0.3975923	total: 3m 21s	remaining: 7m 36s
460:	learn: 0.3975367	total: 3m 22s	remaining: 7m 35s
461:	learn: 0.3974763	total: 3m 22s	remaining: 7m 34s
462:	learn: 0.3974180	total: 3m 22s	remaining: 7m 34s
463:	learn: 0.3973568	total: 3m 23s	remaining: 7m 33s
464:	learn: 0.3973179	total: 3m 23s	remaining: 7m 33s
465:	learn: 0.3972536	total: 3m 24s	remaining: 7m 32s
466:	learn: 0.3972059	total: 3m 24s	remaining: 7m 32s
467:	learn: 0.3971450	total: 3m 24s	remaining: 7m 31s
468:	learn: 0.3971081	total: 3m 25s	remaining: 7m 31s
469:	learn: 0.3970464	total: 3m 25s	remaining: 7m 30s
470:	learn: 0.3969852	total: 3m 26s	remaining: 7m 30s
471:	learn: 0.3969231	total: 3m 26s	remaining: 7m 29s
472:	learn: 0.3968774	total: 3m 26s	remaining: 7m 29s
473:	learn: 0.3968089	total: 3m 27s	remaining: 7m 29s
474:	learn: 0.3967563	total: 3m 27s	remaining: 7m 28s
475:	learn: 0.3967047	total: 3m 28s	remaining: 7m 28s
476:	learn: 0.3966425	total: 3m 28s	remaining: 7m 27s
477:	learn: 0.3965866	total: 3m 29s	remaining: 7m 27s
478:	learn: 0.3965391	total: 3m 29s	remaining: 7m 26s
479:	learn: 0.3964681	total: 3m 30s	remaining: 7m 26s
480:	learn: 0.3964189	total: 3m 30s	remaining: 7m 25s
481:	learn: 0.3963702	total: 3m 30s	remaining: 7m 25s
482:	learn: 0.3963179	total: 3m 31s	remaining: 7m 24s
483:	learn: 0.3962624	total: 3m 31s	remaining: 7m 24s
484:	learn: 0.3962161	total: 3m 31s	remaining: 7m 23s
485:	learn: 0.3961346	total: 3m 32s	remaining: 7m 22s
486:	learn: 0.3960769	total: 3m 32s	remaining: 7m 22s
487:	learn: 0.3960081	total: 3m 33s	remaining: 7m 22s
488:	learn: 0.3959464	total: 3m 33s	remaining: 7m 21s
489:	learn: 0.3958906	total: 3m 34s	remaining: 7m 21s
490:	learn: 0.3958207	total: 3m 34s	remaining: 7m 21s
491:	learn: 0.3958013	total: 3m 35s	remaining: 7m 20s
492:	learn: 0.3957371	total: 3m 35s	remaining: 7m 20s
493:	learn: 0.3956814	total: 3m 35s	remaining: 7m 19s
494:	learn: 0.3956285	total: 3m 36s	remaining: 7m 19s
495:	learn: 0.3955724	total: 3m 36s	remaining: 7m 18s
496:	learn: 0.3955309	total: 3m 37s	remaining: 7m 18s
497:	learn: 0.3954762	total: 3m 37s	remaining: 7m 17s
498:	learn: 0.3954125	total: 3m 38s	remaining: 7m 17s
499:	learn: 0.3953608	total: 3m 38s	remaining: 7m 17s
500:	learn: 0.3953088	total: 3m 39s	remaining: 7m 16s
501:	learn: 0.3952667	total: 3m 39s	remaining: 7m 16s
502:	learn: 0.3951894	total: 3m 39s	remaining: 7m 15s
503:	learn: 0.3951269	total: 3m 40s	remaining: 7m 15s
504:	learn: 0.3950726	total: 3m 40s	remaining: 7m 15s
505:	learn: 0.3950263	total: 3m 41s	remaining: 7m 14s
506:	learn: 0.3949684	total: 3m 41s	remaining: 7m 14s
507:	learn: 0.3949073	total: 3m 42s	remaining: 7m 13s
508:	learn: 0.3948482	total: 3m 42s	remaining: 7m 13s
509:	learn: 0.3948051	total: 3m 42s	remaining: 7m 12s
510:	learn: 0.3947600	total: 3m 43s	remaining: 7m 12s
511:	learn: 0.3946991	total: 3m 43s	remaining: 7m 11s
512:	learn: 0.3946489	total: 3m 44s	remaining: 7m 11s
513:	learn: 0.3945857	total: 3m 44s	remaining: 7m 10s
514:	learn: 0.3945300	total: 3m 45s	remaining: 7m 10s
515:	learn: 0.3944682	total: 3m 45s	remaining: 7m 9s
516:	learn: 0.3944352	total: 3m 45s	remaining: 7m 9s
517:	learn: 0.3943809	total: 3m 46s	remaining: 7m 8s
518:	learn: 0.3943216	total: 3m 46s	remaining: 7m 8s
519:	learn: 0.3942613	total: 3m 47s	remaining: 7m 7s
520:	learn: 0.3942043	total: 3m 47s	remaining: 7m 7s
521:	learn: 0.3941465	total: 3m 47s	remaining: 7m 7s
522:	learn: 0.3940839	total: 3m 48s	remaining: 7m 6s
523:	learn: 0.3940277	total: 3m 48s	remaining: 7m 6s
524:	learn: 0.3939811	total: 3m 49s	remaining: 7m 5s
525:	learn: 0.3939192	total: 3m 49s	remaining: 7m 5s
526:	learn: 0.3938451	total: 3m 50s	remaining: 7m 4s
527:	learn: 0.3938360	total: 3m 50s	remaining: 7m 4s
528:	learn: 0.3937823	total: 3m 50s	remaining: 7m 3s
529:	learn: 0.3937203	total: 3m 51s	remaining: 7m 3s
530:	learn: 0.3936668	total: 3m 52s	remaining: 7m 3s
531:	learn: 0.3935962	total: 3m 52s	remaining: 7m 2s
532:	learn: 0.3935263	total: 3m 52s	remaining: 7m 2s
533:	learn: 0.3934701	total: 3m 53s	remaining: 7m 2s
534:	learn: 0.3934155	total: 3m 53s	remaining: 7m 1s
535:	learn: 0.3933596	total: 3m 54s	remaining: 7m 1s
536:	learn: 0.3933083	total: 3m 54s	remaining: 7m
537:	learn: 0.3932683	total: 3m 55s	remaining: 7m
538:	learn: 0.3932015	total: 3m 55s	remaining: 6m 59s
539:	learn: 0.3931476	total: 3m 55s	remaining: 6m 59s
540:	learn: 0.3930801	total: 3m 56s	remaining: 6m 59s
541:	learn: 0.3930259	total: 3m 56s	remaining: 6m 58s
542:	learn: 0.3929888	total: 3m 57s	remaining: 6m 57s
543:	learn: 0.3929435	total: 3m 57s	remaining: 6m 57s
544:	learn: 0.3928837	total: 3m 57s	remaining: 6m 57s
545:	learn: 0.3928245	total: 3m 58s	remaining: 6m 56s
546:	learn: 0.3927695	total: 3m 58s	remaining: 6m 56s
547:	learn: 0.3927201	total: 3m 59s	remaining: 6m 55s
548:	learn: 0.3926772	total: 3m 59s	remaining: 6m 55s
549:	learn: 0.3926242	total: 4m	remaining: 6m 54s
550:	learn: 0.3925872	total: 4m	remaining: 6m 54s
551:	learn: 0.3925434	total: 4m	remaining: 6m 53s
552:	learn: 0.3924931	total: 4m 1s	remaining: 6m 53s
553:	learn: 0.3924356	total: 4m 1s	remaining: 6m 52s
554:	learn: 0.3923807	total: 4m 2s	remaining: 6m 52s
555:	learn: 0.3923201	total: 4m 2s	remaining: 6m 51s
556:	learn: 0.3922589	total: 4m 3s	remaining: 6m 51s
557:	learn: 0.3922051	total: 4m 3s	remaining: 6m 51s
558:	learn: 0.3921541	total: 4m 3s	remaining: 6m 50s
559:	learn: 0.3921058	total: 4m 4s	remaining: 6m 49s
560:	learn: 0.3920446	total: 4m 4s	remaining: 6m 49s
561:	learn: 0.3919976	total: 4m 5s	remaining: 6m 48s
562:	learn: 0.3919442	total: 4m 5s	remaining: 6m 48s
563:	learn: 0.3918968	total: 4m 5s	remaining: 6m 48s
564:	learn: 0.3918394	total: 4m 6s	remaining: 6m 47s
565:	learn: 0.3917869	total: 4m 6s	remaining: 6m 47s
566:	learn: 0.3917463	total: 4m 7s	remaining: 6m 46s
567:	learn: 0.3916944	total: 4m 7s	remaining: 6m 46s
568:	learn: 0.3916323	total: 4m 8s	remaining: 6m 45s
569:	learn: 0.3915804	total: 4m 8s	remaining: 6m 45s
570:	learn: 0.3915341	total: 4m 9s	remaining: 6m 45s
571:	learn: 0.3914864	total: 4m 9s	remaining: 6m 44s
572:	learn: 0.3914342	total: 4m 9s	remaining: 6m 44s
573:	learn: 0.3913655	total: 4m 10s	remaining: 6m 43s
574:	learn: 0.3913021	total: 4m 10s	remaining: 6m 43s
575:	learn: 0.3912648	total: 4m 11s	remaining: 6m 42s
576:	learn: 0.3912187	total: 4m 11s	remaining: 6m 42s
577:	learn: 0.3911746	total: 4m 11s	remaining: 6m 41s
578:	learn: 0.3911233	total: 4m 12s	remaining: 6m 41s
579:	learn: 0.3910832	total: 4m 12s	remaining: 6m 40s
580:	learn: 0.3910290	total: 4m 13s	remaining: 6m 40s
581:	learn: 0.3909918	total: 4m 13s	remaining: 6m 39s
582:	learn: 0.3909876	total: 4m 13s	remaining: 6m 39s
583:	learn: 0.3909475	total: 4m 14s	remaining: 6m 38s
584:	learn: 0.3908919	total: 4m 14s	remaining: 6m 38s
585:	learn: 0.3908480	total: 4m 14s	remaining: 6m 37s
586:	learn: 0.3907958	total: 4m 15s	remaining: 6m 37s
587:	learn: 0.3907485	total: 4m 15s	remaining: 6m 36s
588:	learn: 0.3906877	total: 4m 16s	remaining: 6m 36s
589:	learn: 0.3906227	total: 4m 16s	remaining: 6m 36s
590:	learn: 0.3905806	total: 4m 17s	remaining: 6m 35s
591:	learn: 0.3905289	total: 4m 17s	remaining: 6m 35s
592:	learn: 0.3904891	total: 4m 17s	remaining: 6m 34s
593:	learn: 0.3904523	total: 4m 18s	remaining: 6m 34s
594:	learn: 0.3904064	total: 4m 18s	remaining: 6m 33s
595:	learn: 0.3903663	total: 4m 19s	remaining: 6m 33s
596:	learn: 0.3903270	total: 4m 19s	remaining: 6m 32s
597:	learn: 0.3902830	total: 4m 19s	remaining: 6m 32s
598:	learn: 0.3902483	total: 4m 20s	remaining: 6m 31s
599:	learn: 0.3901997	total: 4m 20s	remaining: 6m 31s
600:	learn: 0.3901982	total: 4m 20s	remaining: 6m 30s
601:	learn: 0.3901356	total: 4m 21s	remaining: 6m 29s
602:	learn: 0.3900953	total: 4m 21s	remaining: 6m 29s
603:	learn: 0.3900638	total: 4m 22s	remaining: 6m 28s
604:	learn: 0.3900242	total: 4m 22s	remaining: 6m 28s
605:	learn: 0.3899648	total: 4m 23s	remaining: 6m 28s
606:	learn: 0.3899223	total: 4m 23s	remaining: 6m 27s
607:	learn: 0.3898591	total: 4m 24s	remaining: 6m 27s
608:	learn: 0.3897972	total: 4m 24s	remaining: 6m 27s
609:	learn: 0.3897538	total: 4m 25s	remaining: 6m 26s
610:	learn: 0.3897119	total: 4m 25s	remaining: 6m 26s
611:	learn: 0.3896663	total: 4m 25s	remaining: 6m 25s
612:	learn: 0.3896176	total: 4m 26s	remaining: 6m 25s
613:	learn: 0.3895583	total: 4m 26s	remaining: 6m 24s
614:	learn: 0.3895095	total: 4m 27s	remaining: 6m 24s
615:	learn: 0.3894500	total: 4m 27s	remaining: 6m 24s
616:	learn: 0.3894055	total: 4m 28s	remaining: 6m 23s
617:	learn: 0.3893559	total: 4m 28s	remaining: 6m 23s
618:	learn: 0.3893242	total: 4m 29s	remaining: 6m 23s
619:	learn: 0.3892753	total: 4m 29s	remaining: 6m 22s
620:	learn: 0.3892306	total: 4m 30s	remaining: 6m 22s
621:	learn: 0.3891813	total: 4m 30s	remaining: 6m 21s
622:	learn: 0.3891348	total: 4m 30s	remaining: 6m 21s
623:	learn: 0.3890774	total: 4m 31s	remaining: 6m 20s
624:	learn: 0.3890275	total: 4m 31s	remaining: 6m 20s
625:	learn: 0.3889754	total: 4m 32s	remaining: 6m 20s
626:	learn: 0.3889686	total: 4m 32s	remaining: 6m 19s
627:	learn: 0.3889187	total: 4m 32s	remaining: 6m 18s
628:	learn: 0.3888695	total: 4m 33s	remaining: 6m 18s
629:	learn: 0.3888172	total: 4m 33s	remaining: 6m 18s
630:	learn: 0.3887693	total: 4m 34s	remaining: 6m 17s
631:	learn: 0.3887205	total: 4m 34s	remaining: 6m 17s
632:	learn: 0.3886828	total: 4m 35s	remaining: 6m 16s
633:	learn: 0.3886480	total: 4m 35s	remaining: 6m 16s
634:	learn: 0.3886095	total: 4m 35s	remaining: 6m 15s
635:	learn: 0.3885720	total: 4m 36s	remaining: 6m 15s
636:	learn: 0.3885285	total: 4m 36s	remaining: 6m 14s
637:	learn: 0.3884723	total: 4m 37s	remaining: 6m 14s
638:	learn: 0.3884247	total: 4m 37s	remaining: 6m 13s
639:	learn: 0.3883887	total: 4m 37s	remaining: 6m 13s
640:	learn: 0.3883357	total: 4m 38s	remaining: 6m 13s
641:	learn: 0.3882922	total: 4m 39s	remaining: 6m 12s
642:	learn: 0.3882434	total: 4m 39s	remaining: 6m 12s
643:	learn: 0.3882018	total: 4m 40s	remaining: 6m 12s
644:	learn: 0.3881397	total: 4m 40s	remaining: 6m 12s
645:	learn: 0.3880886	total: 4m 41s	remaining: 6m 11s
646:	learn: 0.3880383	total: 4m 41s	remaining: 6m 11s
647:	learn: 0.3879897	total: 4m 42s	remaining: 6m 10s
648:	learn: 0.3879337	total: 4m 42s	remaining: 6m 10s
649:	learn: 0.3878738	total: 4m 43s	remaining: 6m 10s
650:	learn: 0.3878174	total: 4m 43s	remaining: 6m 9s
651:	learn: 0.3877714	total: 4m 43s	remaining: 6m 9s
652:	learn: 0.3877126	total: 4m 44s	remaining: 6m 8s
653:	learn: 0.3876677	total: 4m 44s	remaining: 6m 8s
654:	learn: 0.3876294	total: 4m 45s	remaining: 6m 7s
655:	learn: 0.3875816	total: 4m 45s	remaining: 6m 7s
656:	learn: 0.3875502	total: 4m 46s	remaining: 6m 7s
657:	learn: 0.3875018	total: 4m 46s	remaining: 6m 6s
658:	learn: 0.3874551	total: 4m 47s	remaining: 6m 6s
659:	learn: 0.3873996	total: 4m 47s	remaining: 6m 6s
660:	learn: 0.3873587	total: 4m 48s	remaining: 6m 5s
661:	learn: 0.3873168	total: 4m 48s	remaining: 6m 5s
662:	learn: 0.3872601	total: 4m 49s	remaining: 6m 5s
663:	learn: 0.3872079	total: 4m 49s	remaining: 6m 4s
664:	learn: 0.3871511	total: 4m 50s	remaining: 6m 4s
665:	learn: 0.3870789	total: 4m 50s	remaining: 6m 4s
666:	learn: 0.3870512	total: 4m 51s	remaining: 6m 3s
667:	learn: 0.3869966	total: 4m 51s	remaining: 6m 3s
668:	learn: 0.3869525	total: 4m 52s	remaining: 6m 3s
669:	learn: 0.3869097	total: 4m 52s	remaining: 6m 2s
670:	learn: 0.3868568	total: 4m 53s	remaining: 6m 2s
671:	learn: 0.3868016	total: 4m 54s	remaining: 6m 2s
672:	learn: 0.3867503	total: 4m 54s	remaining: 6m 2s
673:	learn: 0.3867194	total: 4m 55s	remaining: 6m 1s
674:	learn: 0.3866761	total: 4m 55s	remaining: 6m 1s
675:	learn: 0.3866467	total: 4m 56s	remaining: 6m 1s
676:	learn: 0.3866100	total: 4m 57s	remaining: 6m 1s
677:	learn: 0.3865604	total: 4m 57s	remaining: 6m
678:	learn: 0.3865125	total: 4m 58s	remaining: 6m
679:	learn: 0.3864785	total: 4m 58s	remaining: 6m
680:	learn: 0.3864311	total: 4m 58s	remaining: 5m 59s
681:	learn: 0.3863975	total: 4m 59s	remaining: 5m 58s
682:	learn: 0.3863428	total: 4m 59s	remaining: 5m 58s
683:	learn: 0.3862917	total: 5m	remaining: 5m 58s
684:	learn: 0.3862294	total: 5m	remaining: 5m 57s
685:	learn: 0.3861708	total: 5m 1s	remaining: 5m 57s
686:	learn: 0.3861155	total: 5m 1s	remaining: 5m 57s
687:	learn: 0.3860605	total: 5m 2s	remaining: 5m 56s
688:	learn: 0.3860031	total: 5m 2s	remaining: 5m 56s
689:	learn: 0.3859586	total: 5m 3s	remaining: 5m 56s
690:	learn: 0.3859230	total: 5m 3s	remaining: 5m 55s
691:	learn: 0.3858656	total: 5m 4s	remaining: 5m 55s
692:	learn: 0.3858177	total: 5m 4s	remaining: 5m 54s
693:	learn: 0.3857681	total: 5m 5s	remaining: 5m 54s
694:	learn: 0.3857271	total: 5m 5s	remaining: 5m 54s
695:	learn: 0.3856816	total: 5m 6s	remaining: 5m 53s
696:	learn: 0.3856485	total: 5m 6s	remaining: 5m 53s
697:	learn: 0.3856057	total: 5m 7s	remaining: 5m 52s
698:	learn: 0.3856041	total: 5m 7s	remaining: 5m 52s
699:	learn: 0.3855585	total: 5m 7s	remaining: 5m 51s
700:	learn: 0.3855156	total: 5m 8s	remaining: 5m 51s
701:	learn: 0.3854761	total: 5m 8s	remaining: 5m 50s
702:	learn: 0.3854317	total: 5m 9s	remaining: 5m 50s
703:	learn: 0.3853922	total: 5m 9s	remaining: 5m 49s
704:	learn: 0.3853431	total: 5m 9s	remaining: 5m 49s
705:	learn: 0.3852945	total: 5m 10s	remaining: 5m 49s
706:	learn: 0.3852578	total: 5m 10s	remaining: 5m 48s
707:	learn: 0.3852179	total: 5m 11s	remaining: 5m 48s
708:	learn: 0.3851754	total: 5m 11s	remaining: 5m 47s
709:	learn: 0.3851202	total: 5m 12s	remaining: 5m 47s
710:	learn: 0.3850777	total: 5m 12s	remaining: 5m 46s
711:	learn: 0.3850297	total: 5m 13s	remaining: 5m 46s
712:	learn: 0.3849944	total: 5m 13s	remaining: 5m 45s
713:	learn: 0.3849454	total: 5m 13s	remaining: 5m 45s
714:	learn: 0.3849099	total: 5m 14s	remaining: 5m 44s
715:	learn: 0.3848639	total: 5m 14s	remaining: 5m 44s
716:	learn: 0.3848191	total: 5m 14s	remaining: 5m 43s
717:	learn: 0.3847898	total: 5m 15s	remaining: 5m 43s
718:	learn: 0.3847419	total: 5m 15s	remaining: 5m 43s
719:	learn: 0.3846915	total: 5m 16s	remaining: 5m 42s
720:	learn: 0.3846409	total: 5m 16s	remaining: 5m 42s
721:	learn: 0.3845957	total: 5m 17s	remaining: 5m 41s
722:	learn: 0.3845379	total: 5m 17s	remaining: 5m 41s
723:	learn: 0.3844805	total: 5m 18s	remaining: 5m 40s
724:	learn: 0.3844376	total: 5m 18s	remaining: 5m 40s
725:	learn: 0.3843939	total: 5m 18s	remaining: 5m 40s
726:	learn: 0.3843461	total: 5m 19s	remaining: 5m 39s
727:	learn: 0.3842918	total: 5m 19s	remaining: 5m 39s
728:	learn: 0.3842528	total: 5m 20s	remaining: 5m 38s
729:	learn: 0.3842109	total: 5m 20s	remaining: 5m 38s
730:	learn: 0.3841634	total: 5m 20s	remaining: 5m 37s
731:	learn: 0.3841189	total: 5m 21s	remaining: 5m 37s
732:	learn: 0.3840671	total: 5m 21s	remaining: 5m 36s
733:	learn: 0.3840270	total: 5m 22s	remaining: 5m 36s
734:	learn: 0.3839746	total: 5m 22s	remaining: 5m 35s
735:	learn: 0.3839299	total: 5m 22s	remaining: 5m 35s
736:	learn: 0.3839019	total: 5m 23s	remaining: 5m 34s
737:	learn: 0.3838603	total: 5m 23s	remaining: 5m 34s
738:	learn: 0.3838181	total: 5m 24s	remaining: 5m 33s
739:	learn: 0.3837649	total: 5m 24s	remaining: 5m 33s
740:	learn: 0.3837349	total: 5m 24s	remaining: 5m 32s
741:	learn: 0.3836695	total: 5m 25s	remaining: 5m 32s
742:	learn: 0.3836126	total: 5m 25s	remaining: 5m 32s
743:	learn: 0.3835717	total: 5m 26s	remaining: 5m 31s
744:	learn: 0.3835239	total: 5m 26s	remaining: 5m 31s
745:	learn: 0.3834826	total: 5m 27s	remaining: 5m 30s
746:	learn: 0.3834469	total: 5m 27s	remaining: 5m 30s
747:	learn: 0.3834096	total: 5m 27s	remaining: 5m 29s
748:	learn: 0.3833627	total: 5m 28s	remaining: 5m 29s
749:	learn: 0.3833291	total: 5m 28s	remaining: 5m 28s
750:	learn: 0.3832857	total: 5m 29s	remaining: 5m 28s
751:	learn: 0.3832327	total: 5m 29s	remaining: 5m 27s
752:	learn: 0.3831905	total: 5m 30s	remaining: 5m 27s
753:	learn: 0.3831436	total: 5m 30s	remaining: 5m 27s
754:	learn: 0.3830973	total: 5m 31s	remaining: 5m 26s
755:	learn: 0.3830592	total: 5m 31s	remaining: 5m 26s
756:	learn: 0.3830036	total: 5m 31s	remaining: 5m 25s
757:	learn: 0.3829625	total: 5m 32s	remaining: 5m 25s
758:	learn: 0.3829170	total: 5m 32s	remaining: 5m 24s
759:	learn: 0.3828615	total: 5m 33s	remaining: 5m 24s
760:	learn: 0.3828132	total: 5m 33s	remaining: 5m 24s
761:	learn: 0.3827819	total: 5m 34s	remaining: 5m 23s
762:	learn: 0.3827537	total: 5m 34s	remaining: 5m 23s
763:	learn: 0.3827204	total: 5m 34s	remaining: 5m 22s
764:	learn: 0.3827133	total: 5m 35s	remaining: 5m 22s
765:	learn: 0.3826678	total: 5m 35s	remaining: 5m 21s
766:	learn: 0.3826337	total: 5m 36s	remaining: 5m 21s
767:	learn: 0.3825764	total: 5m 36s	remaining: 5m 20s
768:	learn: 0.3825358	total: 5m 36s	remaining: 5m 20s
769:	learn: 0.3824925	total: 5m 37s	remaining: 5m 19s
770:	learn: 0.3824452	total: 5m 37s	remaining: 5m 19s
771:	learn: 0.3823954	total: 5m 38s	remaining: 5m 19s
772:	learn: 0.3823539	total: 5m 38s	remaining: 5m 18s
773:	learn: 0.3823093	total: 5m 39s	remaining: 5m 18s
774:	learn: 0.3822597	total: 5m 39s	remaining: 5m 17s
775:	learn: 0.3822200	total: 5m 39s	remaining: 5m 17s
776:	learn: 0.3821654	total: 5m 40s	remaining: 5m 16s
777:	learn: 0.3821123	total: 5m 40s	remaining: 5m 16s
778:	learn: 0.3820696	total: 5m 41s	remaining: 5m 15s
779:	learn: 0.3820680	total: 5m 41s	remaining: 5m 15s
780:	learn: 0.3820213	total: 5m 42s	remaining: 5m 14s
781:	learn: 0.3819759	total: 5m 42s	remaining: 5m 14s
782:	learn: 0.3819441	total: 5m 42s	remaining: 5m 14s
783:	learn: 0.3819080	total: 5m 43s	remaining: 5m 13s
784:	learn: 0.3818708	total: 5m 43s	remaining: 5m 13s
785:	learn: 0.3818164	total: 5m 44s	remaining: 5m 12s
786:	learn: 0.3818137	total: 5m 44s	remaining: 5m 12s
787:	learn: 0.3817762	total: 5m 45s	remaining: 5m 11s
788:	learn: 0.3817400	total: 5m 45s	remaining: 5m 11s
789:	learn: 0.3816910	total: 5m 46s	remaining: 5m 10s
790:	learn: 0.3816486	total: 5m 46s	remaining: 5m 10s
791:	learn: 0.3816424	total: 5m 46s	remaining: 5m 9s
792:	learn: 0.3816025	total: 5m 47s	remaining: 5m 9s
793:	learn: 0.3815717	total: 5m 47s	remaining: 5m 8s
794:	learn: 0.3815378	total: 5m 47s	remaining: 5m 8s
795:	learn: 0.3815058	total: 5m 48s	remaining: 5m 7s
796:	learn: 0.3814574	total: 5m 48s	remaining: 5m 7s
797:	learn: 0.3813992	total: 5m 49s	remaining: 5m 7s
798:	learn: 0.3813533	total: 5m 49s	remaining: 5m 6s
799:	learn: 0.3813033	total: 5m 49s	remaining: 5m 6s
800:	learn: 0.3812544	total: 5m 50s	remaining: 5m 5s
801:	learn: 0.3812145	total: 5m 50s	remaining: 5m 5s
802:	learn: 0.3811668	total: 5m 51s	remaining: 5m 4s
803:	learn: 0.3811325	total: 5m 51s	remaining: 5m 4s
804:	learn: 0.3810788	total: 5m 51s	remaining: 5m 3s
805:	learn: 0.3810429	total: 5m 52s	remaining: 5m 3s
806:	learn: 0.3809873	total: 5m 52s	remaining: 5m 2s
807:	learn: 0.3809393	total: 5m 53s	remaining: 5m 2s
808:	learn: 0.3809013	total: 5m 53s	remaining: 5m 1s
809:	learn: 0.3808625	total: 5m 53s	remaining: 5m 1s
810:	learn: 0.3808327	total: 5m 54s	remaining: 5m 1s
811:	learn: 0.3807948	total: 5m 54s	remaining: 5m
812:	learn: 0.3807427	total: 5m 55s	remaining: 5m
813:	learn: 0.3806963	total: 5m 55s	remaining: 4m 59s
814:	learn: 0.3806571	total: 5m 55s	remaining: 4m 59s
815:	learn: 0.3806153	total: 5m 56s	remaining: 4m 58s
816:	learn: 0.3805747	total: 5m 56s	remaining: 4m 58s
817:	learn: 0.3805349	total: 5m 57s	remaining: 4m 57s
818:	learn: 0.3804762	total: 5m 57s	remaining: 4m 57s
819:	learn: 0.3804436	total: 5m 57s	remaining: 4m 56s
820:	learn: 0.3803959	total: 5m 58s	remaining: 4m 56s
821:	learn: 0.3803505	total: 5m 58s	remaining: 4m 55s
822:	learn: 0.3803115	total: 5m 59s	remaining: 4m 55s
823:	learn: 0.3802564	total: 5m 59s	remaining: 4m 55s
824:	learn: 0.3802265	total: 6m	remaining: 4m 54s
825:	learn: 0.3801588	total: 6m	remaining: 4m 54s
826:	learn: 0.3801233	total: 6m 1s	remaining: 4m 53s
827:	learn: 0.3800651	total: 6m 1s	remaining: 4m 53s
828:	learn: 0.3800197	total: 6m 2s	remaining: 4m 53s
829:	learn: 0.3799775	total: 6m 2s	remaining: 4m 52s
830:	learn: 0.3799406	total: 6m 2s	remaining: 4m 52s
831:	learn: 0.3799078	total: 6m 3s	remaining: 4m 51s
832:	learn: 0.3798442	total: 6m 3s	remaining: 4m 51s
833:	learn: 0.3797972	total: 6m 4s	remaining: 4m 50s
834:	learn: 0.3797564	total: 6m 4s	remaining: 4m 50s
835:	learn: 0.3797082	total: 6m 4s	remaining: 4m 49s
836:	learn: 0.3796648	total: 6m 5s	remaining: 4m 49s
837:	learn: 0.3796278	total: 6m 5s	remaining: 4m 48s
838:	learn: 0.3795849	total: 6m 6s	remaining: 4m 48s
839:	learn: 0.3795440	total: 6m 6s	remaining: 4m 47s
840:	learn: 0.3794953	total: 6m 6s	remaining: 4m 47s
841:	learn: 0.3794944	total: 6m 7s	remaining: 4m 46s
842:	learn: 0.3794522	total: 6m 7s	remaining: 4m 46s
843:	learn: 0.3794062	total: 6m 7s	remaining: 4m 45s
844:	learn: 0.3793624	total: 6m 8s	remaining: 4m 45s
845:	learn: 0.3793167	total: 6m 8s	remaining: 4m 45s
846:	learn: 0.3792657	total: 6m 9s	remaining: 4m 44s
847:	learn: 0.3792257	total: 6m 9s	remaining: 4m 44s
848:	learn: 0.3791847	total: 6m 9s	remaining: 4m 43s
849:	learn: 0.3791421	total: 6m 10s	remaining: 4m 43s
850:	learn: 0.3791021	total: 6m 10s	remaining: 4m 42s
851:	learn: 0.3790544	total: 6m 11s	remaining: 4m 42s
852:	learn: 0.3790000	total: 6m 11s	remaining: 4m 41s
853:	learn: 0.3789502	total: 6m 12s	remaining: 4m 41s
854:	learn: 0.3789073	total: 6m 12s	remaining: 4m 41s
855:	learn: 0.3788680	total: 6m 13s	remaining: 4m 40s
856:	learn: 0.3788218	total: 6m 13s	remaining: 4m 40s
857:	learn: 0.3787719	total: 6m 13s	remaining: 4m 39s
858:	learn: 0.3787197	total: 6m 14s	remaining: 4m 39s
859:	learn: 0.3786765	total: 6m 14s	remaining: 4m 38s
860:	learn: 0.3786384	total: 6m 15s	remaining: 4m 38s
861:	learn: 0.3785969	total: 6m 15s	remaining: 4m 38s
862:	learn: 0.3785578	total: 6m 16s	remaining: 4m 37s
863:	learn: 0.3785034	total: 6m 16s	remaining: 4m 37s
864:	learn: 0.3784713	total: 6m 16s	remaining: 4m 36s
865:	learn: 0.3784221	total: 6m 17s	remaining: 4m 36s
866:	learn: 0.3783718	total: 6m 17s	remaining: 4m 35s
867:	learn: 0.3783274	total: 6m 18s	remaining: 4m 35s
868:	learn: 0.3782738	total: 6m 18s	remaining: 4m 34s
869:	learn: 0.3782330	total: 6m 18s	remaining: 4m 34s
870:	learn: 0.3781986	total: 6m 19s	remaining: 4m 33s
871:	learn: 0.3781576	total: 6m 19s	remaining: 4m 33s
872:	learn: 0.3781068	total: 6m 20s	remaining: 4m 33s
873:	learn: 0.3780785	total: 6m 20s	remaining: 4m 32s
874:	learn: 0.3780376	total: 6m 20s	remaining: 4m 32s
875:	learn: 0.3779951	total: 6m 21s	remaining: 4m 31s
876:	learn: 0.3779431	total: 6m 21s	remaining: 4m 31s
877:	learn: 0.3779063	total: 6m 22s	remaining: 4m 30s
878:	learn: 0.3778695	total: 6m 22s	remaining: 4m 30s
879:	learn: 0.3778222	total: 6m 22s	remaining: 4m 29s
880:	learn: 0.3777676	total: 6m 23s	remaining: 4m 29s
881:	learn: 0.3777358	total: 6m 23s	remaining: 4m 28s
882:	learn: 0.3776911	total: 6m 23s	remaining: 4m 28s
883:	learn: 0.3776591	total: 6m 24s	remaining: 4m 27s
884:	learn: 0.3776231	total: 6m 24s	remaining: 4m 27s
885:	learn: 0.3775899	total: 6m 24s	remaining: 4m 26s
886:	learn: 0.3775351	total: 6m 25s	remaining: 4m 26s
887:	learn: 0.3774859	total: 6m 25s	remaining: 4m 25s
888:	learn: 0.3774361	total: 6m 26s	remaining: 4m 25s
889:	learn: 0.3773900	total: 6m 26s	remaining: 4m 24s
890:	learn: 0.3773479	total: 6m 26s	remaining: 4m 24s
891:	learn: 0.3773002	total: 6m 27s	remaining: 4m 23s
892:	learn: 0.3772595	total: 6m 27s	remaining: 4m 23s
893:	learn: 0.3772215	total: 6m 28s	remaining: 4m 23s
894:	learn: 0.3771888	total: 6m 28s	remaining: 4m 22s
895:	learn: 0.3771310	total: 6m 28s	remaining: 4m 22s
896:	learn: 0.3771121	total: 6m 29s	remaining: 4m 21s
897:	learn: 0.3770771	total: 6m 29s	remaining: 4m 21s
898:	learn: 0.3770331	total: 6m 29s	remaining: 4m 20s
899:	learn: 0.3769841	total: 6m 30s	remaining: 4m 20s
900:	learn: 0.3769395	total: 6m 30s	remaining: 4m 19s
901:	learn: 0.3769065	total: 6m 31s	remaining: 4m 19s
902:	learn: 0.3768523	total: 6m 31s	remaining: 4m 19s
903:	learn: 0.3768073	total: 6m 32s	remaining: 4m 18s
904:	learn: 0.3767668	total: 6m 32s	remaining: 4m 18s
905:	learn: 0.3767348	total: 6m 32s	remaining: 4m 17s
906:	learn: 0.3766908	total: 6m 33s	remaining: 4m 17s
907:	learn: 0.3766586	total: 6m 33s	remaining: 4m 16s
908:	learn: 0.3766183	total: 6m 34s	remaining: 4m 16s
909:	learn: 0.3765818	total: 6m 34s	remaining: 4m 15s
910:	learn: 0.3765436	total: 6m 34s	remaining: 4m 15s
911:	learn: 0.3765080	total: 6m 35s	remaining: 4m 14s
912:	learn: 0.3764624	total: 6m 35s	remaining: 4m 14s
913:	learn: 0.3764160	total: 6m 36s	remaining: 4m 13s
914:	learn: 0.3763706	total: 6m 36s	remaining: 4m 13s
915:	learn: 0.3763186	total: 6m 36s	remaining: 4m 13s
916:	learn: 0.3762962	total: 6m 37s	remaining: 4m 12s
917:	learn: 0.3762518	total: 6m 37s	remaining: 4m 12s
918:	learn: 0.3762069	total: 6m 38s	remaining: 4m 11s
919:	learn: 0.3761733	total: 6m 38s	remaining: 4m 11s
920:	learn: 0.3761262	total: 6m 38s	remaining: 4m 10s
921:	learn: 0.3760769	total: 6m 39s	remaining: 4m 10s
922:	learn: 0.3760434	total: 6m 39s	remaining: 4m 9s
923:	learn: 0.3759945	total: 6m 40s	remaining: 4m 9s
924:	learn: 0.3759528	total: 6m 40s	remaining: 4m 8s
925:	learn: 0.3759102	total: 6m 40s	remaining: 4m 8s
926:	learn: 0.3758647	total: 6m 41s	remaining: 4m 8s
927:	learn: 0.3758387	total: 6m 41s	remaining: 4m 7s
928:	learn: 0.3758056	total: 6m 42s	remaining: 4m 7s
929:	learn: 0.3757601	total: 6m 42s	remaining: 4m 6s
930:	learn: 0.3757140	total: 6m 42s	remaining: 4m 6s
931:	learn: 0.3756711	total: 6m 43s	remaining: 4m 5s
932:	learn: 0.3756244	total: 6m 43s	remaining: 4m 5s
933:	learn: 0.3755873	total: 6m 44s	remaining: 4m 4s
934:	learn: 0.3755357	total: 6m 44s	remaining: 4m 4s
935:	learn: 0.3754891	total: 6m 44s	remaining: 4m 4s
936:	learn: 0.3754521	total: 6m 45s	remaining: 4m 3s
937:	learn: 0.3754071	total: 6m 45s	remaining: 4m 3s
938:	learn: 0.3753557	total: 6m 46s	remaining: 4m 2s
939:	learn: 0.3753350	total: 6m 46s	remaining: 4m 2s
940:	learn: 0.3752909	total: 6m 47s	remaining: 4m 1s
941:	learn: 0.3752401	total: 6m 47s	remaining: 4m 1s
942:	learn: 0.3752023	total: 6m 48s	remaining: 4m 1s
943:	learn: 0.3751684	total: 6m 48s	remaining: 4m
944:	learn: 0.3751443	total: 6m 48s	remaining: 4m
945:	learn: 0.3750933	total: 6m 49s	remaining: 3m 59s
946:	learn: 0.3750497	total: 6m 49s	remaining: 3m 59s
947:	learn: 0.3750095	total: 6m 50s	remaining: 3m 58s
948:	learn: 0.3749773	total: 6m 50s	remaining: 3m 58s
949:	learn: 0.3749321	total: 6m 50s	remaining: 3m 57s
950:	learn: 0.3748876	total: 6m 51s	remaining: 3m 57s
951:	learn: 0.3748535	total: 6m 51s	remaining: 3m 56s
952:	learn: 0.3748100	total: 6m 52s	remaining: 3m 56s
953:	learn: 0.3747762	total: 6m 52s	remaining: 3m 56s
954:	learn: 0.3747342	total: 6m 52s	remaining: 3m 55s
955:	learn: 0.3746862	total: 6m 53s	remaining: 3m 55s
956:	learn: 0.3746367	total: 6m 53s	remaining: 3m 54s
957:	learn: 0.3745867	total: 6m 53s	remaining: 3m 54s
958:	learn: 0.3745539	total: 6m 54s	remaining: 3m 53s
959:	learn: 0.3745112	total: 6m 54s	remaining: 3m 53s
960:	learn: 0.3744599	total: 6m 55s	remaining: 3m 52s
961:	learn: 0.3744139	total: 6m 55s	remaining: 3m 52s
962:	learn: 0.3743728	total: 6m 55s	remaining: 3m 51s
963:	learn: 0.3743341	total: 6m 56s	remaining: 3m 51s
964:	learn: 0.3742873	total: 6m 56s	remaining: 3m 50s
965:	learn: 0.3742413	total: 6m 57s	remaining: 3m 50s
966:	learn: 0.3742065	total: 6m 57s	remaining: 3m 50s
967:	learn: 0.3741634	total: 6m 57s	remaining: 3m 49s
968:	learn: 0.3741155	total: 6m 58s	remaining: 3m 49s
969:	learn: 0.3740658	total: 6m 58s	remaining: 3m 48s
970:	learn: 0.3740199	total: 6m 59s	remaining: 3m 48s
971:	learn: 0.3739834	total: 6m 59s	remaining: 3m 47s
972:	learn: 0.3739342	total: 6m 59s	remaining: 3m 47s
973:	learn: 0.3738998	total: 7m	remaining: 3m 46s
974:	learn: 0.3738593	total: 7m	remaining: 3m 46s
975:	learn: 0.3738131	total: 7m 1s	remaining: 3m 46s
976:	learn: 0.3737799	total: 7m 1s	remaining: 3m 45s
977:	learn: 0.3737360	total: 7m 2s	remaining: 3m 45s
978:	learn: 0.3736874	total: 7m 2s	remaining: 3m 44s
979:	learn: 0.3736617	total: 7m 3s	remaining: 3m 44s
980:	learn: 0.3736145	total: 7m 3s	remaining: 3m 44s
981:	learn: 0.3735725	total: 7m 4s	remaining: 3m 43s
982:	learn: 0.3735313	total: 7m 4s	remaining: 3m 43s
983:	learn: 0.3734955	total: 7m 5s	remaining: 3m 42s
984:	learn: 0.3734520	total: 7m 5s	remaining: 3m 42s
985:	learn: 0.3734056	total: 7m 6s	remaining: 3m 42s
986:	learn: 0.3733645	total: 7m 6s	remaining: 3m 41s
987:	learn: 0.3733364	total: 7m 6s	remaining: 3m 41s
988:	learn: 0.3733062	total: 7m 7s	remaining: 3m 40s
989:	learn: 0.3732610	total: 7m 7s	remaining: 3m 40s
990:	learn: 0.3732134	total: 7m 8s	remaining: 3m 39s
991:	learn: 0.3731697	total: 7m 8s	remaining: 3m 39s
992:	learn: 0.3731298	total: 7m 8s	remaining: 3m 38s
993:	learn: 0.3730944	total: 7m 9s	remaining: 3m 38s
994:	learn: 0.3730444	total: 7m 9s	remaining: 3m 38s
995:	learn: 0.3730105	total: 7m 10s	remaining: 3m 37s
996:	learn: 0.3729790	total: 7m 10s	remaining: 3m 37s
997:	learn: 0.3729346	total: 7m 10s	remaining: 3m 36s
998:	learn: 0.3728846	total: 7m 11s	remaining: 3m 36s
999:	learn: 0.3728511	total: 7m 11s	remaining: 3m 35s
1000:	learn: 0.3728068	total: 7m 11s	remaining: 3m 35s
1001:	learn: 0.3727673	total: 7m 12s	remaining: 3m 34s
1002:	learn: 0.3727189	total: 7m 12s	remaining: 3m 34s
1003:	learn: 0.3726843	total: 7m 13s	remaining: 3m 34s
1004:	learn: 0.3726373	total: 7m 13s	remaining: 3m 33s
1005:	learn: 0.3726047	total: 7m 13s	remaining: 3m 33s
1006:	learn: 0.3725626	total: 7m 14s	remaining: 3m 32s
1007:	learn: 0.3725338	total: 7m 14s	remaining: 3m 32s
1008:	learn: 0.3724995	total: 7m 15s	remaining: 3m 31s
1009:	learn: 0.3724513	total: 7m 15s	remaining: 3m 31s
1010:	learn: 0.3724186	total: 7m 15s	remaining: 3m 30s
1011:	learn: 0.3723659	total: 7m 16s	remaining: 3m 30s
1012:	learn: 0.3723226	total: 7m 16s	remaining: 3m 29s
1013:	learn: 0.3722931	total: 7m 17s	remaining: 3m 29s
1014:	learn: 0.3722520	total: 7m 17s	remaining: 3m 29s
1015:	learn: 0.3722147	total: 7m 17s	remaining: 3m 28s
1016:	learn: 0.3721822	total: 7m 18s	remaining: 3m 28s
1017:	learn: 0.3721344	total: 7m 18s	remaining: 3m 27s
1018:	learn: 0.3720965	total: 7m 19s	remaining: 3m 27s
1019:	learn: 0.3720591	total: 7m 19s	remaining: 3m 26s
1020:	learn: 0.3720213	total: 7m 19s	remaining: 3m 26s
1021:	learn: 0.3719863	total: 7m 20s	remaining: 3m 25s
1022:	learn: 0.3719461	total: 7m 20s	remaining: 3m 25s
1023:	learn: 0.3719084	total: 7m 20s	remaining: 3m 24s
1024:	learn: 0.3718728	total: 7m 21s	remaining: 3m 24s
1025:	learn: 0.3718210	total: 7m 21s	remaining: 3m 24s
1026:	learn: 0.3717873	total: 7m 22s	remaining: 3m 23s
1027:	learn: 0.3717455	total: 7m 22s	remaining: 3m 23s
1028:	learn: 0.3716997	total: 7m 22s	remaining: 3m 22s
1029:	learn: 0.3716588	total: 7m 23s	remaining: 3m 22s
1030:	learn: 0.3716111	total: 7m 23s	remaining: 3m 21s
1031:	learn: 0.3715739	total: 7m 23s	remaining: 3m 21s
1032:	learn: 0.3715333	total: 7m 24s	remaining: 3m 20s
1033:	learn: 0.3714976	total: 7m 24s	remaining: 3m 20s
1034:	learn: 0.3714605	total: 7m 25s	remaining: 3m 20s
1035:	learn: 0.3714275	total: 7m 25s	remaining: 3m 19s
1036:	learn: 0.3713954	total: 7m 26s	remaining: 3m 19s
1037:	learn: 0.3713534	total: 7m 26s	remaining: 3m 18s
1038:	learn: 0.3713004	total: 7m 27s	remaining: 3m 18s
1039:	learn: 0.3712638	total: 7m 27s	remaining: 3m 17s
1040:	learn: 0.3712159	total: 7m 27s	remaining: 3m 17s
1041:	learn: 0.3711870	total: 7m 28s	remaining: 3m 17s
1042:	learn: 0.3711533	total: 7m 28s	remaining: 3m 16s
1043:	learn: 0.3711132	total: 7m 28s	remaining: 3m 16s
1044:	learn: 0.3710755	total: 7m 29s	remaining: 3m 15s
1045:	learn: 0.3710362	total: 7m 29s	remaining: 3m 15s
1046:	learn: 0.3709847	total: 7m 30s	remaining: 3m 14s
1047:	learn: 0.3709401	total: 7m 30s	remaining: 3m 14s
1048:	learn: 0.3708905	total: 7m 31s	remaining: 3m 13s
1049:	learn: 0.3708512	total: 7m 31s	remaining: 3m 13s
1050:	learn: 0.3708285	total: 7m 31s	remaining: 3m 12s
1051:	learn: 0.3707889	total: 7m 32s	remaining: 3m 12s
1052:	learn: 0.3707452	total: 7m 32s	remaining: 3m 12s
1053:	learn: 0.3707042	total: 7m 33s	remaining: 3m 11s
1054:	learn: 0.3706691	total: 7m 33s	remaining: 3m 11s
1055:	learn: 0.3706248	total: 7m 33s	remaining: 3m 10s
1056:	learn: 0.3705718	total: 7m 34s	remaining: 3m 10s
1057:	learn: 0.3705258	total: 7m 34s	remaining: 3m 10s
1058:	learn: 0.3704909	total: 7m 35s	remaining: 3m 9s
1059:	learn: 0.3704439	total: 7m 35s	remaining: 3m 9s
1060:	learn: 0.3704080	total: 7m 36s	remaining: 3m 8s
1061:	learn: 0.3703617	total: 7m 36s	remaining: 3m 8s
1062:	learn: 0.3703131	total: 7m 37s	remaining: 3m 7s
1063:	learn: 0.3702608	total: 7m 37s	remaining: 3m 7s
1064:	learn: 0.3702167	total: 7m 37s	remaining: 3m 7s
1065:	learn: 0.3701689	total: 7m 38s	remaining: 3m 6s
1066:	learn: 0.3701226	total: 7m 38s	remaining: 3m 6s
1067:	learn: 0.3700870	total: 7m 39s	remaining: 3m 5s
1068:	learn: 0.3700429	total: 7m 39s	remaining: 3m 5s
1069:	learn: 0.3699965	total: 7m 39s	remaining: 3m 4s
1070:	learn: 0.3699636	total: 7m 40s	remaining: 3m 4s
1071:	learn: 0.3699273	total: 7m 40s	remaining: 3m 3s
1072:	learn: 0.3698918	total: 7m 41s	remaining: 3m 3s
1073:	learn: 0.3698537	total: 7m 41s	remaining: 3m 3s
1074:	learn: 0.3698083	total: 7m 41s	remaining: 3m 2s
1075:	learn: 0.3697631	total: 7m 42s	remaining: 3m 2s
1076:	learn: 0.3697269	total: 7m 42s	remaining: 3m 1s
1077:	learn: 0.3696720	total: 7m 43s	remaining: 3m 1s
1078:	learn: 0.3696309	total: 7m 43s	remaining: 3m
1079:	learn: 0.3695920	total: 7m 44s	remaining: 3m
1080:	learn: 0.3695650	total: 7m 44s	remaining: 2m 59s
1081:	learn: 0.3695126	total: 7m 44s	remaining: 2m 59s
1082:	learn: 0.3694802	total: 7m 45s	remaining: 2m 59s
1083:	learn: 0.3694433	total: 7m 45s	remaining: 2m 58s
1084:	learn: 0.3694108	total: 7m 45s	remaining: 2m 58s
1085:	learn: 0.3693710	total: 7m 46s	remaining: 2m 57s
1086:	learn: 0.3693483	total: 7m 46s	remaining: 2m 57s
1087:	learn: 0.3693022	total: 7m 47s	remaining: 2m 56s
1088:	learn: 0.3692610	total: 7m 47s	remaining: 2m 56s
1089:	learn: 0.3692278	total: 7m 47s	remaining: 2m 55s
1090:	learn: 0.3691887	total: 7m 48s	remaining: 2m 55s
1091:	learn: 0.3691445	total: 7m 48s	remaining: 2m 55s
1092:	learn: 0.3691074	total: 7m 48s	remaining: 2m 54s
1093:	learn: 0.3690732	total: 7m 49s	remaining: 2m 54s
1094:	learn: 0.3690344	total: 7m 49s	remaining: 2m 53s
1095:	learn: 0.3689848	total: 7m 50s	remaining: 2m 53s
1096:	learn: 0.3689411	total: 7m 50s	remaining: 2m 52s
1097:	learn: 0.3688923	total: 7m 51s	remaining: 2m 52s
1098:	learn: 0.3688555	total: 7m 51s	remaining: 2m 52s
1099:	learn: 0.3688062	total: 7m 52s	remaining: 2m 51s
1100:	learn: 0.3687594	total: 7m 52s	remaining: 2m 51s
1101:	learn: 0.3687212	total: 7m 52s	remaining: 2m 50s
1102:	learn: 0.3686851	total: 7m 53s	remaining: 2m 50s
1103:	learn: 0.3686386	total: 7m 53s	remaining: 2m 49s
1104:	learn: 0.3685988	total: 7m 54s	remaining: 2m 49s
1105:	learn: 0.3685512	total: 7m 54s	remaining: 2m 49s
1106:	learn: 0.3685119	total: 7m 54s	remaining: 2m 48s
1107:	learn: 0.3684737	total: 7m 55s	remaining: 2m 48s
1108:	learn: 0.3684349	total: 7m 55s	remaining: 2m 47s
1109:	learn: 0.3683802	total: 7m 55s	remaining: 2m 47s
1110:	learn: 0.3683394	total: 7m 56s	remaining: 2m 46s
1111:	learn: 0.3683164	total: 7m 56s	remaining: 2m 46s
1112:	learn: 0.3682666	total: 7m 57s	remaining: 2m 45s
1113:	learn: 0.3682243	total: 7m 57s	remaining: 2m 45s
1114:	learn: 0.3681780	total: 7m 58s	remaining: 2m 45s
1115:	learn: 0.3681431	total: 7m 58s	remaining: 2m 44s
1116:	learn: 0.3680953	total: 7m 58s	remaining: 2m 44s
1117:	learn: 0.3680498	total: 7m 59s	remaining: 2m 43s
1118:	learn: 0.3680051	total: 7m 59s	remaining: 2m 43s
1119:	learn: 0.3679573	total: 8m	remaining: 2m 42s
1120:	learn: 0.3679162	total: 8m	remaining: 2m 42s
1121:	learn: 0.3678776	total: 8m	remaining: 2m 42s
1122:	learn: 0.3678266	total: 8m 1s	remaining: 2m 41s
1123:	learn: 0.3677841	total: 8m 1s	remaining: 2m 41s
1124:	learn: 0.3677458	total: 8m 2s	remaining: 2m 40s
1125:	learn: 0.3677080	total: 8m 2s	remaining: 2m 40s
1126:	learn: 0.3676655	total: 8m 3s	remaining: 2m 39s
1127:	learn: 0.3676312	total: 8m 3s	remaining: 2m 39s
1128:	learn: 0.3675960	total: 8m 3s	remaining: 2m 39s
1129:	learn: 0.3675652	total: 8m 4s	remaining: 2m 38s
1130:	learn: 0.3675188	total: 8m 4s	remaining: 2m 38s
1131:	learn: 0.3674856	total: 8m 5s	remaining: 2m 37s
1132:	learn: 0.3674437	total: 8m 5s	remaining: 2m 37s
1133:	learn: 0.3674135	total: 8m 5s	remaining: 2m 36s
1134:	learn: 0.3673647	total: 8m 6s	remaining: 2m 36s
1135:	learn: 0.3673169	total: 8m 6s	remaining: 2m 35s
1136:	learn: 0.3672736	total: 8m 7s	remaining: 2m 35s
1137:	learn: 0.3672305	total: 8m 7s	remaining: 2m 35s
1138:	learn: 0.3671884	total: 8m 8s	remaining: 2m 34s
1139:	learn: 0.3671572	total: 8m 8s	remaining: 2m 34s
1140:	learn: 0.3671132	total: 8m 9s	remaining: 2m 33s
1141:	learn: 0.3670690	total: 8m 9s	remaining: 2m 33s
1142:	learn: 0.3670336	total: 8m 9s	remaining: 2m 33s
1143:	learn: 0.3669974	total: 8m 10s	remaining: 2m 32s
1144:	learn: 0.3669616	total: 8m 10s	remaining: 2m 32s
1145:	learn: 0.3669316	total: 8m 11s	remaining: 2m 31s
1146:	learn: 0.3668938	total: 8m 11s	remaining: 2m 31s
1147:	learn: 0.3668554	total: 8m 11s	remaining: 2m 30s
1148:	learn: 0.3668135	total: 8m 12s	remaining: 2m 30s
1149:	learn: 0.3667698	total: 8m 12s	remaining: 2m 29s
1150:	learn: 0.3667366	total: 8m 13s	remaining: 2m 29s
1151:	learn: 0.3666940	total: 8m 13s	remaining: 2m 29s
1152:	learn: 0.3666561	total: 8m 14s	remaining: 2m 28s
1153:	learn: 0.3666158	total: 8m 14s	remaining: 2m 28s
1154:	learn: 0.3665729	total: 8m 14s	remaining: 2m 27s
1155:	learn: 0.3665359	total: 8m 15s	remaining: 2m 27s
1156:	learn: 0.3665034	total: 8m 15s	remaining: 2m 26s
1157:	learn: 0.3664580	total: 8m 16s	remaining: 2m 26s
1158:	learn: 0.3664256	total: 8m 16s	remaining: 2m 26s
1159:	learn: 0.3663935	total: 8m 17s	remaining: 2m 25s
1160:	learn: 0.3663544	total: 8m 17s	remaining: 2m 25s
1161:	learn: 0.3663184	total: 8m 18s	remaining: 2m 24s
1162:	learn: 0.3662847	total: 8m 18s	remaining: 2m 24s
1163:	learn: 0.3662513	total: 8m 19s	remaining: 2m 24s
1164:	learn: 0.3662147	total: 8m 19s	remaining: 2m 23s
1165:	learn: 0.3661979	total: 8m 19s	remaining: 2m 23s
1166:	learn: 0.3661568	total: 8m 20s	remaining: 2m 22s
1167:	learn: 0.3661135	total: 8m 20s	remaining: 2m 22s
1168:	learn: 0.3660772	total: 8m 21s	remaining: 2m 21s
1169:	learn: 0.3660451	total: 8m 21s	remaining: 2m 21s
1170:	learn: 0.3660028	total: 8m 21s	remaining: 2m 21s
1171:	learn: 0.3659592	total: 8m 22s	remaining: 2m 20s
1172:	learn: 0.3659206	total: 8m 22s	remaining: 2m 20s
1173:	learn: 0.3658901	total: 8m 23s	remaining: 2m 19s
1174:	learn: 0.3658661	total: 8m 23s	remaining: 2m 19s
1175:	learn: 0.3658300	total: 8m 24s	remaining: 2m 18s
1176:	learn: 0.3657841	total: 8m 24s	remaining: 2m 18s
1177:	learn: 0.3657467	total: 8m 25s	remaining: 2m 18s
1178:	learn: 0.3657145	total: 8m 25s	remaining: 2m 17s
1179:	learn: 0.3656784	total: 8m 26s	remaining: 2m 17s
1180:	learn: 0.3656317	total: 8m 26s	remaining: 2m 16s
1181:	learn: 0.3656016	total: 8m 26s	remaining: 2m 16s
1182:	learn: 0.3655566	total: 8m 27s	remaining: 2m 15s
1183:	learn: 0.3655184	total: 8m 27s	remaining: 2m 15s
1184:	learn: 0.3654930	total: 8m 28s	remaining: 2m 15s
1185:	learn: 0.3654608	total: 8m 28s	remaining: 2m 14s
1186:	learn: 0.3654135	total: 8m 29s	remaining: 2m 14s
1187:	learn: 0.3653775	total: 8m 29s	remaining: 2m 13s
1188:	learn: 0.3653525	total: 8m 29s	remaining: 2m 13s
1189:	learn: 0.3653133	total: 8m 30s	remaining: 2m 12s
1190:	learn: 0.3652647	total: 8m 30s	remaining: 2m 12s
1191:	learn: 0.3652291	total: 8m 31s	remaining: 2m 12s
1192:	learn: 0.3651910	total: 8m 31s	remaining: 2m 11s
1193:	learn: 0.3651532	total: 8m 32s	remaining: 2m 11s
1194:	learn: 0.3651051	total: 8m 32s	remaining: 2m 10s
1195:	learn: 0.3650583	total: 8m 33s	remaining: 2m 10s
1196:	learn: 0.3650189	total: 8m 33s	remaining: 2m 9s
1197:	learn: 0.3649697	total: 8m 33s	remaining: 2m 9s
1198:	learn: 0.3649318	total: 8m 34s	remaining: 2m 9s
1199:	learn: 0.3648882	total: 8m 34s	remaining: 2m 8s
1200:	learn: 0.3648434	total: 8m 35s	remaining: 2m 8s
1201:	learn: 0.3648031	total: 8m 35s	remaining: 2m 7s
1202:	learn: 0.3647674	total: 8m 35s	remaining: 2m 7s
1203:	learn: 0.3647280	total: 8m 36s	remaining: 2m 6s
1204:	learn: 0.3646997	total: 8m 36s	remaining: 2m 6s
1205:	learn: 0.3646750	total: 8m 37s	remaining: 2m 6s
1206:	learn: 0.3646364	total: 8m 37s	remaining: 2m 5s
1207:	learn: 0.3645835	total: 8m 38s	remaining: 2m 5s
1208:	learn: 0.3645542	total: 8m 38s	remaining: 2m 4s
1209:	learn: 0.3645117	total: 8m 39s	remaining: 2m 4s
1210:	learn: 0.3644680	total: 8m 39s	remaining: 2m 4s
1211:	learn: 0.3644259	total: 8m 40s	remaining: 2m 3s
1212:	learn: 0.3643756	total: 8m 40s	remaining: 2m 3s
1213:	learn: 0.3643315	total: 8m 41s	remaining: 2m 2s
1214:	learn: 0.3643004	total: 8m 41s	remaining: 2m 2s
1215:	learn: 0.3642776	total: 8m 41s	remaining: 2m 1s
1216:	learn: 0.3642303	total: 8m 42s	remaining: 2m 1s
1217:	learn: 0.3641888	total: 8m 42s	remaining: 2m 1s
1218:	learn: 0.3641477	total: 8m 43s	remaining: 2m
1219:	learn: 0.3641161	total: 8m 43s	remaining: 2m
1220:	learn: 0.3640872	total: 8m 44s	remaining: 1m 59s
1221:	learn: 0.3640533	total: 8m 44s	remaining: 1m 59s
1222:	learn: 0.3640070	total: 8m 44s	remaining: 1m 58s
1223:	learn: 0.3639650	total: 8m 45s	remaining: 1m 58s
1224:	learn: 0.3639194	total: 8m 45s	remaining: 1m 57s
1225:	learn: 0.3638921	total: 8m 45s	remaining: 1m 57s
1226:	learn: 0.3638524	total: 8m 46s	remaining: 1m 57s
1227:	learn: 0.3638045	total: 8m 46s	remaining: 1m 56s
1228:	learn: 0.3637661	total: 8m 47s	remaining: 1m 56s
1229:	learn: 0.3637137	total: 8m 47s	remaining: 1m 55s
1230:	learn: 0.3636761	total: 8m 47s	remaining: 1m 55s
1231:	learn: 0.3636373	total: 8m 48s	remaining: 1m 54s
1232:	learn: 0.3635932	total: 8m 48s	remaining: 1m 54s
1233:	learn: 0.3635517	total: 8m 49s	remaining: 1m 54s
1234:	learn: 0.3635191	total: 8m 49s	remaining: 1m 53s
1235:	learn: 0.3634958	total: 8m 49s	remaining: 1m 53s
1236:	learn: 0.3634642	total: 8m 50s	remaining: 1m 52s
1237:	learn: 0.3634392	total: 8m 50s	remaining: 1m 52s
1238:	learn: 0.3634079	total: 8m 51s	remaining: 1m 51s
1239:	learn: 0.3633652	total: 8m 51s	remaining: 1m 51s
1240:	learn: 0.3633188	total: 8m 51s	remaining: 1m 51s
1241:	learn: 0.3632858	total: 8m 52s	remaining: 1m 50s
1242:	learn: 0.3632524	total: 8m 52s	remaining: 1m 50s
1243:	learn: 0.3632174	total: 8m 53s	remaining: 1m 49s
1244:	learn: 0.3631820	total: 8m 53s	remaining: 1m 49s
1245:	learn: 0.3631531	total: 8m 53s	remaining: 1m 48s
1246:	learn: 0.3631211	total: 8m 54s	remaining: 1m 48s
1247:	learn: 0.3630824	total: 8m 55s	remaining: 1m 48s
1248:	learn: 0.3630559	total: 8m 55s	remaining: 1m 47s
1249:	learn: 0.3630259	total: 8m 55s	remaining: 1m 47s
1250:	learn: 0.3629854	total: 8m 56s	remaining: 1m 46s
1251:	learn: 0.3629433	total: 8m 56s	remaining: 1m 46s
1252:	learn: 0.3629075	total: 8m 57s	remaining: 1m 45s
1253:	learn: 0.3628803	total: 8m 57s	remaining: 1m 45s
1254:	learn: 0.3628370	total: 8m 57s	remaining: 1m 44s
1255:	learn: 0.3627948	total: 8m 58s	remaining: 1m 44s
1256:	learn: 0.3627485	total: 8m 58s	remaining: 1m 44s
1257:	learn: 0.3627068	total: 8m 58s	remaining: 1m 43s
1258:	learn: 0.3626571	total: 8m 59s	remaining: 1m 43s
1259:	learn: 0.3626301	total: 8m 59s	remaining: 1m 42s
1260:	learn: 0.3625925	total: 9m	remaining: 1m 42s
1261:	learn: 0.3625613	total: 9m	remaining: 1m 41s
1262:	learn: 0.3625252	total: 9m 1s	remaining: 1m 41s
1263:	learn: 0.3624957	total: 9m 1s	remaining: 1m 41s
1264:	learn: 0.3624583	total: 9m 1s	remaining: 1m 40s
1265:	learn: 0.3624344	total: 9m 2s	remaining: 1m 40s
1266:	learn: 0.3624064	total: 9m 2s	remaining: 1m 39s
1267:	learn: 0.3623640	total: 9m 3s	remaining: 1m 39s
1268:	learn: 0.3623247	total: 9m 4s	remaining: 1m 39s
1269:	learn: 0.3622834	total: 9m 4s	remaining: 1m 38s
1270:	learn: 0.3622478	total: 9m 4s	remaining: 1m 38s
1271:	learn: 0.3622147	total: 9m 5s	remaining: 1m 37s
1272:	learn: 0.3621721	total: 9m 5s	remaining: 1m 37s
1273:	learn: 0.3621288	total: 9m 6s	remaining: 1m 36s
1274:	learn: 0.3620863	total: 9m 6s	remaining: 1m 36s
1275:	learn: 0.3620427	total: 9m 7s	remaining: 1m 36s
1276:	learn: 0.3620069	total: 9m 7s	remaining: 1m 35s
1277:	learn: 0.3619838	total: 9m 7s	remaining: 1m 35s
1278:	learn: 0.3619362	total: 9m 8s	remaining: 1m 34s
1279:	learn: 0.3618956	total: 9m 8s	remaining: 1m 34s
1280:	learn: 0.3618479	total: 9m 9s	remaining: 1m 33s
1281:	learn: 0.3618124	total: 9m 9s	remaining: 1m 33s
1282:	learn: 0.3618117	total: 9m 9s	remaining: 1m 32s
1283:	learn: 0.3617869	total: 9m 10s	remaining: 1m 32s
1284:	learn: 0.3617399	total: 9m 10s	remaining: 1m 32s
1285:	learn: 0.3617094	total: 9m 11s	remaining: 1m 31s
1286:	learn: 0.3616791	total: 9m 11s	remaining: 1m 31s
1287:	learn: 0.3616451	total: 9m 11s	remaining: 1m 30s
1288:	learn: 0.3616095	total: 9m 12s	remaining: 1m 30s
1289:	learn: 0.3615709	total: 9m 12s	remaining: 1m 29s
1290:	learn: 0.3615262	total: 9m 13s	remaining: 1m 29s
1291:	learn: 0.3614911	total: 9m 13s	remaining: 1m 29s
1292:	learn: 0.3614480	total: 9m 14s	remaining: 1m 28s
1293:	learn: 0.3614019	total: 9m 14s	remaining: 1m 28s
1294:	learn: 0.3613591	total: 9m 14s	remaining: 1m 27s
1295:	learn: 0.3613151	total: 9m 15s	remaining: 1m 27s
1296:	learn: 0.3612801	total: 9m 15s	remaining: 1m 26s
1297:	learn: 0.3612572	total: 9m 16s	remaining: 1m 26s
1298:	learn: 0.3612212	total: 9m 16s	remaining: 1m 26s
1299:	learn: 0.3611956	total: 9m 16s	remaining: 1m 25s
1300:	learn: 0.3611496	total: 9m 17s	remaining: 1m 25s
1301:	learn: 0.3611109	total: 9m 17s	remaining: 1m 24s
1302:	learn: 0.3610725	total: 9m 18s	remaining: 1m 24s
1303:	learn: 0.3610329	total: 9m 18s	remaining: 1m 23s
1304:	learn: 0.3609956	total: 9m 18s	remaining: 1m 23s
1305:	learn: 0.3609540	total: 9m 19s	remaining: 1m 23s
1306:	learn: 0.3609204	total: 9m 19s	remaining: 1m 22s
1307:	learn: 0.3608819	total: 9m 20s	remaining: 1m 22s
1308:	learn: 0.3608518	total: 9m 20s	remaining: 1m 21s
1309:	learn: 0.3608179	total: 9m 20s	remaining: 1m 21s
1310:	learn: 0.3607845	total: 9m 21s	remaining: 1m 20s
1311:	learn: 0.3607558	total: 9m 21s	remaining: 1m 20s
1312:	learn: 0.3607105	total: 9m 21s	remaining: 1m 20s
1313:	learn: 0.3606858	total: 9m 22s	remaining: 1m 19s
1314:	learn: 0.3606524	total: 9m 22s	remaining: 1m 19s
1315:	learn: 0.3606155	total: 9m 22s	remaining: 1m 18s
1316:	learn: 0.3605676	total: 9m 23s	remaining: 1m 18s
1317:	learn: 0.3605240	total: 9m 23s	remaining: 1m 17s
1318:	learn: 0.3604903	total: 9m 24s	remaining: 1m 17s
1319:	learn: 0.3604608	total: 9m 24s	remaining: 1m 16s
1320:	learn: 0.3604399	total: 9m 24s	remaining: 1m 16s
1321:	learn: 0.3604397	total: 9m 25s	remaining: 1m 16s
1322:	learn: 0.3604103	total: 9m 25s	remaining: 1m 15s
1323:	learn: 0.3603777	total: 9m 25s	remaining: 1m 15s
1324:	learn: 0.3603364	total: 9m 26s	remaining: 1m 14s
1325:	learn: 0.3603021	total: 9m 26s	remaining: 1m 14s
1326:	learn: 0.3602685	total: 9m 27s	remaining: 1m 13s
1327:	learn: 0.3602434	total: 9m 27s	remaining: 1m 13s
1328:	learn: 0.3602044	total: 9m 27s	remaining: 1m 13s
1329:	learn: 0.3601601	total: 9m 28s	remaining: 1m 12s
1330:	learn: 0.3601196	total: 9m 28s	remaining: 1m 12s
1331:	learn: 0.3600764	total: 9m 29s	remaining: 1m 11s
1332:	learn: 0.3600345	total: 9m 29s	remaining: 1m 11s
1333:	learn: 0.3600035	total: 9m 29s	remaining: 1m 10s
1334:	learn: 0.3599570	total: 9m 30s	remaining: 1m 10s
1335:	learn: 0.3599274	total: 9m 30s	remaining: 1m 10s
1336:	learn: 0.3598861	total: 9m 31s	remaining: 1m 9s
1337:	learn: 0.3598428	total: 9m 31s	remaining: 1m 9s
1338:	learn: 0.3598113	total: 9m 32s	remaining: 1m 8s
1339:	learn: 0.3597696	total: 9m 32s	remaining: 1m 8s
1340:	learn: 0.3597447	total: 9m 32s	remaining: 1m 7s
1341:	learn: 0.3597161	total: 9m 33s	remaining: 1m 7s
1342:	learn: 0.3596645	total: 9m 33s	remaining: 1m 7s
1343:	learn: 0.3596185	total: 9m 34s	remaining: 1m 6s
1344:	learn: 0.3595836	total: 9m 34s	remaining: 1m 6s
1345:	learn: 0.3595372	total: 9m 34s	remaining: 1m 5s
1346:	learn: 0.3595005	total: 9m 35s	remaining: 1m 5s
1347:	learn: 0.3594659	total: 9m 35s	remaining: 1m 4s
1348:	learn: 0.3594409	total: 9m 36s	remaining: 1m 4s
1349:	learn: 0.3594034	total: 9m 36s	remaining: 1m 4s
1350:	learn: 0.3593626	total: 9m 36s	remaining: 1m 3s
1351:	learn: 0.3593261	total: 9m 37s	remaining: 1m 3s
1352:	learn: 0.3592867	total: 9m 37s	remaining: 1m 2s
1353:	learn: 0.3592569	total: 9m 38s	remaining: 1m 2s
1354:	learn: 0.3592222	total: 9m 38s	remaining: 1m 1s
1355:	learn: 0.3591865	total: 9m 39s	remaining: 1m 1s
1356:	learn: 0.3591428	total: 9m 39s	remaining: 1m 1s
1357:	learn: 0.3591152	total: 9m 40s	remaining: 1m
1358:	learn: 0.3590779	total: 9m 40s	remaining: 1m
1359:	learn: 0.3590489	total: 9m 40s	remaining: 59.8s
1360:	learn: 0.3590033	total: 9m 41s	remaining: 59.4s
1361:	learn: 0.3589753	total: 9m 41s	remaining: 58.9s
1362:	learn: 0.3589455	total: 9m 42s	remaining: 58.5s
1363:	learn: 0.3589143	total: 9m 42s	remaining: 58.1s
1364:	learn: 0.3588816	total: 9m 43s	remaining: 57.7s
1365:	learn: 0.3588497	total: 9m 43s	remaining: 57.2s
1366:	learn: 0.3588118	total: 9m 43s	remaining: 56.8s
1367:	learn: 0.3587751	total: 9m 44s	remaining: 56.4s
1368:	learn: 0.3587350	total: 9m 44s	remaining: 56s
1369:	learn: 0.3587027	total: 9m 45s	remaining: 55.5s
1370:	learn: 0.3586746	total: 9m 45s	remaining: 55.1s
1371:	learn: 0.3586368	total: 9m 45s	remaining: 54.7s
1372:	learn: 0.3586023	total: 9m 46s	remaining: 54.2s
1373:	learn: 0.3585692	total: 9m 46s	remaining: 53.8s
1374:	learn: 0.3585375	total: 9m 47s	remaining: 53.4s
1375:	learn: 0.3585054	total: 9m 47s	remaining: 52.9s
1376:	learn: 0.3584781	total: 9m 47s	remaining: 52.5s
1377:	learn: 0.3584406	total: 9m 48s	remaining: 52.1s
1378:	learn: 0.3584071	total: 9m 48s	remaining: 51.7s
1379:	learn: 0.3583676	total: 9m 49s	remaining: 51.2s
1380:	learn: 0.3583360	total: 9m 49s	remaining: 50.8s
1381:	learn: 0.3583001	total: 9m 50s	remaining: 50.4s
1382:	learn: 0.3582651	total: 9m 50s	remaining: 50s
1383:	learn: 0.3582274	total: 9m 50s	remaining: 49.5s
1384:	learn: 0.3581811	total: 9m 51s	remaining: 49.1s
1385:	learn: 0.3581368	total: 9m 51s	remaining: 48.7s
1386:	learn: 0.3580953	total: 9m 52s	remaining: 48.2s
1387:	learn: 0.3580574	total: 9m 52s	remaining: 47.8s
1388:	learn: 0.3580153	total: 9m 52s	remaining: 47.4s
1389:	learn: 0.3579778	total: 9m 53s	remaining: 47s
1390:	learn: 0.3579419	total: 9m 53s	remaining: 46.5s
1391:	learn: 0.3579162	total: 9m 54s	remaining: 46.1s
1392:	learn: 0.3578906	total: 9m 54s	remaining: 45.7s
1393:	learn: 0.3578467	total: 9m 55s	remaining: 45.2s
1394:	learn: 0.3578204	total: 9m 55s	remaining: 44.8s
1395:	learn: 0.3577868	total: 9m 55s	remaining: 44.4s
1396:	learn: 0.3577419	total: 9m 56s	remaining: 44s
1397:	learn: 0.3577120	total: 9m 56s	remaining: 43.5s
1398:	learn: 0.3576807	total: 9m 57s	remaining: 43.1s
1399:	learn: 0.3576394	total: 9m 57s	remaining: 42.7s
1400:	learn: 0.3576007	total: 9m 58s	remaining: 42.3s
1401:	learn: 0.3575551	total: 9m 58s	remaining: 41.8s
1402:	learn: 0.3575182	total: 9m 59s	remaining: 41.4s
1403:	learn: 0.3574893	total: 9m 59s	remaining: 41s
1404:	learn: 0.3574557	total: 9m 59s	remaining: 40.6s
1405:	learn: 0.3574165	total: 10m	remaining: 40.1s
1406:	learn: 0.3573831	total: 10m	remaining: 39.7s
1407:	learn: 0.3573492	total: 10m 1s	remaining: 39.3s
1408:	learn: 0.3573344	total: 10m 1s	remaining: 38.8s
1409:	learn: 0.3572971	total: 10m 1s	remaining: 38.4s
1410:	learn: 0.3572682	total: 10m 2s	remaining: 38s
1411:	learn: 0.3572147	total: 10m 2s	remaining: 37.6s
1412:	learn: 0.3571682	total: 10m 3s	remaining: 37.1s
1413:	learn: 0.3571295	total: 10m 3s	remaining: 36.7s
1414:	learn: 0.3570965	total: 10m 3s	remaining: 36.3s
1415:	learn: 0.3570531	total: 10m 4s	remaining: 35.9s
1416:	learn: 0.3570162	total: 10m 4s	remaining: 35.4s
1417:	learn: 0.3569769	total: 10m 5s	remaining: 35s
1418:	learn: 0.3569369	total: 10m 5s	remaining: 34.6s
1419:	learn: 0.3569122	total: 10m 5s	remaining: 34.1s
1420:	learn: 0.3568749	total: 10m 6s	remaining: 33.7s
1421:	learn: 0.3568346	total: 10m 6s	remaining: 33.3s
1422:	learn: 0.3567916	total: 10m 7s	remaining: 32.9s
1423:	learn: 0.3567545	total: 10m 7s	remaining: 32.4s
1424:	learn: 0.3567188	total: 10m 8s	remaining: 32s
1425:	learn: 0.3566779	total: 10m 8s	remaining: 31.6s
1426:	learn: 0.3566361	total: 10m 8s	remaining: 31.2s
1427:	learn: 0.3565962	total: 10m 9s	remaining: 30.7s
1428:	learn: 0.3565620	total: 10m 9s	remaining: 30.3s
1429:	learn: 0.3565300	total: 10m 10s	remaining: 29.9s
1430:	learn: 0.3564898	total: 10m 10s	remaining: 29.4s
1431:	learn: 0.3564516	total: 10m 10s	remaining: 29s
1432:	learn: 0.3564187	total: 10m 11s	remaining: 28.6s
1433:	learn: 0.3563757	total: 10m 11s	remaining: 28.2s
1434:	learn: 0.3563407	total: 10m 12s	remaining: 27.7s
1435:	learn: 0.3563119	total: 10m 12s	remaining: 27.3s
1436:	learn: 0.3562896	total: 10m 12s	remaining: 26.9s
1437:	learn: 0.3562494	total: 10m 13s	remaining: 26.5s
1438:	learn: 0.3562208	total: 10m 13s	remaining: 26s
1439:	learn: 0.3561776	total: 10m 14s	remaining: 25.6s
1440:	learn: 0.3561492	total: 10m 14s	remaining: 25.2s
1441:	learn: 0.3561027	total: 10m 15s	remaining: 24.8s
1442:	learn: 0.3560690	total: 10m 15s	remaining: 24.3s
1443:	learn: 0.3560342	total: 10m 16s	remaining: 23.9s
1444:	learn: 0.3559868	total: 10m 16s	remaining: 23.5s
1445:	learn: 0.3559468	total: 10m 17s	remaining: 23s
1446:	learn: 0.3559063	total: 10m 17s	remaining: 22.6s
1447:	learn: 0.3558726	total: 10m 17s	remaining: 22.2s
1448:	learn: 0.3558471	total: 10m 18s	remaining: 21.8s
1449:	learn: 0.3558135	total: 10m 18s	remaining: 21.3s
1450:	learn: 0.3557731	total: 10m 19s	remaining: 20.9s
1451:	learn: 0.3557313	total: 10m 19s	remaining: 20.5s
1452:	learn: 0.3556908	total: 10m 20s	remaining: 20.1s
1453:	learn: 0.3556511	total: 10m 20s	remaining: 19.6s
1454:	learn: 0.3556198	total: 10m 21s	remaining: 19.2s
1455:	learn: 0.3555939	total: 10m 21s	remaining: 18.8s
1456:	learn: 0.3555573	total: 10m 22s	remaining: 18.4s
1457:	learn: 0.3555292	total: 10m 22s	remaining: 17.9s
1458:	learn: 0.3554862	total: 10m 23s	remaining: 17.5s
1459:	learn: 0.3554594	total: 10m 23s	remaining: 17.1s
1460:	learn: 0.3554232	total: 10m 23s	remaining: 16.7s
1461:	learn: 0.3553767	total: 10m 24s	remaining: 16.2s
1462:	learn: 0.3553325	total: 10m 24s	remaining: 15.8s
1463:	learn: 0.3552885	total: 10m 25s	remaining: 15.4s
1464:	learn: 0.3552464	total: 10m 25s	remaining: 14.9s
1465:	learn: 0.3552209	total: 10m 26s	remaining: 14.5s
1466:	learn: 0.3551804	total: 10m 26s	remaining: 14.1s
1467:	learn: 0.3551466	total: 10m 27s	remaining: 13.7s
1468:	learn: 0.3551068	total: 10m 27s	remaining: 13.2s
1469:	learn: 0.3550768	total: 10m 28s	remaining: 12.8s
1470:	learn: 0.3550445	total: 10m 28s	remaining: 12.4s
1471:	learn: 0.3550092	total: 10m 28s	remaining: 12s
1472:	learn: 0.3549719	total: 10m 29s	remaining: 11.5s
1473:	learn: 0.3549292	total: 10m 29s	remaining: 11.1s
1474:	learn: 0.3548902	total: 10m 30s	remaining: 10.7s
1475:	learn: 0.3548530	total: 10m 30s	remaining: 10.3s
1476:	learn: 0.3548173	total: 10m 31s	remaining: 9.83s
1477:	learn: 0.3547724	total: 10m 31s	remaining: 9.4s
1478:	learn: 0.3547346	total: 10m 32s	remaining: 8.97s
1479:	learn: 0.3546904	total: 10m 32s	remaining: 8.55s
1480:	learn: 0.3546629	total: 10m 32s	remaining: 8.12s
1481:	learn: 0.3546294	total: 10m 33s	remaining: 7.69s
1482:	learn: 0.3545795	total: 10m 33s	remaining: 7.27s
1483:	learn: 0.3545586	total: 10m 34s	remaining: 6.84s
1484:	learn: 0.3545189	total: 10m 34s	remaining: 6.41s
1485:	learn: 0.3544780	total: 10m 35s	remaining: 5.98s
1486:	learn: 0.3544383	total: 10m 35s	remaining: 5.56s
1487:	learn: 0.3544031	total: 10m 36s	remaining: 5.13s
1488:	learn: 0.3543657	total: 10m 36s	remaining: 4.7s
1489:	learn: 0.3543306	total: 10m 36s	remaining: 4.27s
1490:	learn: 0.3542906	total: 10m 37s	remaining: 3.85s
1491:	learn: 0.3542630	total: 10m 37s	remaining: 3.42s
1492:	learn: 0.3542280	total: 10m 38s	remaining: 2.99s
1493:	learn: 0.3541917	total: 10m 38s	remaining: 2.56s
1494:	learn: 0.3541466	total: 10m 39s	remaining: 2.14s
1495:	learn: 0.3541201	total: 10m 39s	remaining: 1.71s
1496:	learn: 0.3540851	total: 10m 39s	remaining: 1.28s
1497:	learn: 0.3540437	total: 10m 40s	remaining: 855ms
1498:	learn: 0.3540166	total: 10m 40s	remaining: 427ms
1499:	learn: 0.3539764	total: 10m 41s	remaining: 0us
Wall time: 26min 28s
Out[18]:
VotingClassifier(estimators=[('xgbc',
                              XGBClassifier(base_score=None, booster='gbtree',
                                            colsample_bylevel=None,
                                            colsample_bynode=1,
                                            colsample_bytree=1,
                                            enable_categorical=False, gamma=0,
                                            gpu_id=None, importance_type=None,
                                            interaction_constraints=None,
                                            learning_rate=0.3,
                                            max_delta_step=None, max_depth=6,
                                            min_child_weight=None, missing=nan,
                                            monotone_constraints=None,
                                            n_estimators=103, n_jobs=None,
                                            num_parallel_tree=None,
                                            predictor=None, random_state=57,
                                            reg_alpha=None, reg_lambda=None,
                                            scale_pos_weight=None, subsample=1,
                                            tree_method=None,
                                            validate_parameters=None,
                                            verbosity=None)),
                             ('lgbc',
                              LGBMClassifier(n_estimators=2000,
                                             objective='binary',
                                             random_state=57)),
                             ('catgbc',
                              <catboost.core.CatBoostClassifier object at 0x0000020E210D8B50>)])

Make the prediction

We prepare the submission of the model to the data challenge site. We save the model predictions in a text file that will be uploaded on the data challenge website

We make the prediction on the testing dataframe enlarged X_test_dataframe_enlarged

In [29]:
# Classify the provided test data
y_test = best_model.predict(X_test_dataframe_enlarged).astype(np.int8)
np.savetxt('enlarged_voting_y_test_challenge_student_V2.txt', y_test, fmt='%i' , delimiter=',')

Results Summary

Model Feature Engineering Hard Voting Soft Voting Score FNR+FPR rate (Valid test) 1-(FNR+FPR) (Valid Test)
Adaboost (baseline) Features Selection No No 0.52 0.48
Gradient Boosting Features Selection No No 0.54 0.46
XGBoost Features Selection No No 0.45 0.55
XGBoost Initial features No No 0.40 0.60
LightGBM Initial features No No 0.39 0.61
CatBoost Initial features No No 0.387 0.613
XGBoost + LightGBM + CatBoost Initial features Yes No 0.382 0.618
XGBoost + LightGBM + CatBoost Initial features No Yes 0.384 0.616
XGBoost + LightGBM + CatBoost Enlarged features Yes No 0.379 0.621
Neural Network Initial features No No 0.44 0.56

Conclusion

For this data challenge I followed the following classic machine learning steps:

1/ Data investigation

2/ Data preprocessing

- drop duplicate elements
- convert type of columns

3/ Features selection

- select the best features to simplify the classifiction task

4/ Apply Machine learning algorithms

5/ Apply a neural network

6/ Generate new features

7/ Fit the best model

8/ Predict the label with the best model

We could see that the features selection did not have the expected effect, since when I train a model on the whole dataset without going through the feature selection step, it gives better results as we could see with the model XGBoostand the others boosting models.

Overall, the boosting models provide the best performance. The best performing boosting model for this data challenge seems to be the CatBoost algorithm. But I managed to improve the performance by doing a majority vote between XGBoost, LightGBM, CatBoost

In addition, I managed to further improve the performance of my model by creating new features via linear combinations between features. Extending the number of explicative variables of the input dataset seems to improve the performance of the model

Moreover Recently, deep learning models are state-of-the-art models for image classification tasks, so one could easily deduce that neural networks are the best performing model for this data challenge. However, it seems that this is not the case, as the neural network I implemented does not produce better performance than the boosting models.

Through this data challenge, I realized that refining the hyperparameters of machine learning models is not an easy thing. However, when hyperparameters are well chosen, they usually lead to better performing models. Also, it was noticed that boosting models are better when they work together thanks to and when they are combined with a majority voting system.